

  • The speed of light is far too slow for any potential alien to know we are even here. An alien 50 light years out that detected Hitler’s Olympic address (sent in 1936, one of the first strong signals; intentional transmissions didn’t come until later) still hasn’t gotten a response back to us yet! Even if there are aliens, there are only around 2,000 stars within that distance, so the odds are against any of them being that close.

    That is before we start talking about how signals lose strength over distance. Most of the stars we see in the sky are in the same sub-arm of the Milky Way’s spiral arm that we are in - even starlight doesn’t carry enough energy to be seen from much farther out! (Telescopes can of course detect a lot more, but we are still talking about star-level energy here.) I don’t know what the limits of detecting life on Earth are, but I doubt even the most sensitive possible systems could have detected dinosaurs from the nearest star to us (excluding our Sun). TV (strong signals, but not directed) is detectable farther out, but even then we don’t get much farther - see the rough sketch below. As the world moves to low-power digital communications there is less to detect. There are of course intentional “we are here” messages sent via directional radio telescopes - but those rely on our best guess of where aliens might be, so they could miss a lot, and they only reach so far - and haven’t gotten very far yet.
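    A minimal sketch of the timing and the inverse-square falloff (the 1936 date is from above; the listener distance and everything else are illustrative assumptions):

    ```python
    BROADCAST_YEAR = 1936  # Berlin Olympics address (from the comment above)
    DISTANCE_LY = 50       # assumed distance of the listener, in light years

    arrival = BROADCAST_YEAR + DISTANCE_LY  # year the signal reaches them
    reply = arrival + DISTANCE_LY           # earliest year an answer reaches us
    print(f"signal arrives {arrival}, earliest possible reply here {reply}")

    # Inverse-square law: received flux from an isotropic transmitter
    # falls with the square of the distance.
    def relative_flux(distance_ly: float, reference_ly: float = 1.0) -> float:
        return (reference_ly / distance_ly) ** 2

    print(f"at 50 ly the signal is {relative_flux(DISTANCE_LY):.1e} "
          f"of its strength at 1 ly")
    ```

    With these numbers the earliest reply shows up in 2036, and the same transmitter is already 2,500 times weaker at 50 light years than at one - before any noise or absorption.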





  • Or is it better to save a few bucks now and put them toward next year’s model, which will be faster anyway? Maybe there is a new codec that matters in 3 years but nothing today supports: either way you are forced to replace your server.

    There is no right answer; you are taking your chances when planning for the future. There are many computers more than 10 years old still working just fine in the world, and it is possible that whatever you buy today will be one of them. We get enough press releases that we can predict what will happen next year closely enough, but 5 years out we have much less information. There is no way to know if saving money today is a good choice or not. I can come up with scenarios either way.

    Look at power use. Often last-generation hardware uses more power for the things you do today, so the few dollars you save up front are given back in the power bill over the next couple of years - see the break-even sketch below. (Though if you use the new hardware to do something the old couldn’t, the new will use more power!)

    If there is only a few dollars’ difference in price, go for the best. However, when there are hundreds or even thousands of dollars between them, it becomes a harder decision.
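    A minimal break-even sketch (every number here is a made-up assumption, not a measurement - plug in your own prices and wattages):

    ```python
    PRICE_DIFF = 300.0   # assumed: current-gen box costs $300 more
    OLD_WATTS = 120.0    # assumed: average draw of the last-gen box
    NEW_WATTS = 70.0     # assumed: average draw of the current-gen box
    KWH_PRICE = 0.15     # assumed: electricity price in $/kWh

    extra_kwh_per_year = (OLD_WATTS - NEW_WATTS) * 24 * 365 / 1000
    extra_cost_per_year = extra_kwh_per_year * KWH_PRICE
    breakeven_years = PRICE_DIFF / extra_cost_per_year

    print(f"extra power cost: ${extra_cost_per_year:.0f}/year")
    print(f"break-even after {breakeven_years:.1f} years")
    ```

    With these invented numbers the pricier box pays for itself in about 4.6 years; halve the price gap or double the electricity rate and it flips well inside the next couple of years.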


  • Fiber makes a big difference with fruit: it slows down absorption, and often the sugars are locked up in the fiber and take time to be released.

    Glucose affects the GI because it is absorbed directly into the bloodstream. Fructose cannot be used directly, so the liver processes it - no GI applies.

    Sucrose generally implies no fiber, so the simplification works fine - with the added caveat that only half of the molecule is glucose and so influences the GI.

    That is as far as my knowledge goes, so I need to stop. Even then I’ll stand corrected if an expert weighs in (though it is more likely the “expert” is self-proclaimed and really knows less than me, so I place a high burden of proof on corrections, despite this not being an area where I’m an expert).





  • It is subtly different - generally enough to be annoying, but not actually significant (it is annoying to use ZFS on Linux, but not really hard). There are sometimes advantages and disadvantages, but they are often obscure things that probably won’t matter to you (just as many people didn’t notice the switch from X to Wayland, or from whatever init to systemd - in the end things are annoyingly different, but it isn’t significant). Even where I can list something, in a few years there will be a new version and the answer will have changed.

    The one consistent difference is that BSD is a complete system, which means tools like ifconfig are built in and still work the way they did in 1995 - Linux has gone through several iterations of replacements because someone needed something the default tool didn’t have. OTOH, if you need one of those new features that caused Linux to change in the first place, you have to read the manual page either way, as it will be a new option. Again, this is annoying, but not significant.

    There are several tools which have different options in BSD vs GNU (though you don’t have to use the GNU versions on Linux) - in-place editing, for example, is `sed -i` with GNU sed but `sed -i ''` with BSD sed. Again, annoying; a small sketch of papering over that is below.
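    A minimal sketch of handling that one difference from a script, assuming the usual platform names (check your own targets before relying on it):

    ```python
    import platform
    import subprocess

    def sed_in_place(expr: str, path: str) -> None:
        """Run `sed -i` portably across GNU and BSD userlands."""
        if platform.system() in ("Darwin", "FreeBSD", "NetBSD", "OpenBSD"):
            # BSD sed: -i takes a mandatory backup suffix as a separate
            # argument; an empty string means "no backup file".
            cmd = ["sed", "-i", "", expr, path]
        else:
            # GNU sed: the optional suffix is glued onto -i itself,
            # so a bare -i means "edit in place, no backup".
            cmd = ["sed", "-i", expr, path]
        subprocess.run(cmd, check=True)

    # e.g. sed_in_place("s/foo/bar/g", "example.conf")  # hypothetical file
    ```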








  • Okay, but this will take some time; you may not care to read/understand all of it.

    The issue with robots making robots is that no matter what, when you measure and cut a part there are errors, so the next generation cannot make parts as accurate as the first (it gets worse if the first is already worn from use). The third generation carries the errors of both, and so is even less accurate, until just a few generations later everything collapses, because those errors add up randomly into something that no longer works at all (the machine jams). There is a rough simulation of this drift at the end of this comment.

    There is a way around this, though: we can go back to the time when we didn’t have accuracy in the first place and ask how they created it. It turns out that was about 200 years ago, and the people involved wrote books that we still have, in a language we can still understand - often English (an old form, and filled with the racism and sexism of the day, but still understandable). Better yet, we can look at modern industry and find people who have read those books, refined the methods, and are still using them in the real world today, so they have experience. Apply those methods and you can start over from scratch, creating accuracy, so the errors of the robot making the new one won’t be transferred to the new robot - the new one can thus be better than the old.

    Those methods all work out to one thing: set up your cuts so that all the errors cancel each other out. There will be a lot of measurement errors, of course, but they will be in places where your tolerance for error is very high, whereas where it matters there are none (there are other errors, but they are smaller than measurement errors - and we can minimize those). When you watch how those methods work, you will discover that it looks like the machine is building itself, since the critical parts are made with the machine they are for. Hence my statement that a robot could make itself - you still need external help, but all the critical operations are done by the robot itself, creating another generation of robots that can make other robots without errors adding up.

    Note that most robots cannot make themselves in the above way. There are a lot of parts needed that we don’t know how to make that way, but those can be made by a robot that made itself, so we are never very far from original precision and errors do not add up over time.

    Hope that makes sense.
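    A minimal simulation of the two regimes (the error scale is invented purely for illustration, not a real machining number):

    ```python
    import random

    def copy_chain(generations: int, bootstrapped: bool) -> float:
        """Return the final accumulated error after copying machines."""
        parent_error = 0.0
        for _ in range(generations):
            step_error = random.gauss(0.0, 0.01)  # invented error scale
            if bootstrapped:
                # Cuts arranged so the parent's error cancels out:
                # only this generation's own error remains.
                parent_error = step_error
            else:
                # Naive copy: the child inherits the parent's error
                # on top of its own.
                parent_error += step_error
        return parent_error

    random.seed(1)
    print(f"naive copy, 20 generations:   {copy_chain(20, False):+.3f}")
    print(f"bootstrapped, 20 generations: {copy_chain(20, True):+.3f}")
    ```

    The naive chain is a random walk, so its error grows roughly with the square root of the number of generations; the bootstrapped chain never carries more than one generation’s worth of error, which is the whole point of the cancelling setups.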


  • Bootstrapping is what you are looking for. A lathe is often the start of bootstrapping, because a lathe can make itself. You can also use a lathe to make another lathe, but if you do it that way you slowly lose accuracy over the generations (it is much faster, though, which is why most lathes are made with lathes). By having a lathe make itself you restore accuracy (and if you have learned something, you can sometimes get even higher accuracy than previous rounds). Before you can make a lathe you need precision flat surfaces, but it turns out only basic tools are needed to bootstrap those (and a lot of time). A lathe is considered a machine.

    The point is that robots can make themselves if you program them for that. I’m making a clear distinction here between reproducing themselves and making themselves. A nearly worn-out robot can restart the whole process (so long as it does not fail completely too soon) of making a new robot that is bigger and more accurate than it ever was (if bigger and more accurate is what the programming calls for). That doesn’t mean the same robot could reproduce itself; rather, it has to cause a robot to make itself.