> And someday—by 2045, to be precise—we'll have machines so sophisticated that we'll essentially be able to back up our minds to the cloud.

Personally, I don't see how the one implies the other. A large number of assumptions have to bear out in order for us to back up our minds into non-biological structures. More likely, non-biological AI will simply exist, and it will eventually replace us. Why should non-biological AI wait around for us to figure out how to inherit immortality when that is its starting point? It doesn't matter that we created the conditions for AI to arise; we aren't owed anything. We simply won't be able to adapt as quickly as it can. As an aside, b_b and I have an approach employing current medical technology that we hypothesize could significantly increase a human's lifespan, or at the very least ameliorate the decline. We submitted a grant based on it, but it was rejected, as are most grants these days. :) I'd love to get the idea to Larry Page, because I think it is right up his alley, but that's a very difficult thing to do.
> I'd love to get the idea to Larry Page, because I think it is right up his alley, but that's a very difficult thing to do.

You aren't alone in your assertion that machine intelligence will replace humans within that basic narrative; Hugo de Garis is an artificial intelligence expert predicting the same thing. I can't help you get a hold of Larry Page (obviously), but I would love to help you get in touch with the right people. The "singularity" community is becoming quite well connected, and the amount of money that is about to be directed toward anti-aging technology is going way up. I know a few people who know Aubrey de Grey and Ben Goertzel, and both would in principle be interested in hearing the proposal. Let me know and I can try to set something up.
I think the passion of people like Kurzweil for advancing humanity is quite exciting, and while I share his excitement, I'm a little less optimistic, or rather I set my expectations lower. Unfortunately I haven't spent much time looking at the bleeding-edge technology and research related to the singularity, but I absolutely love that we have people motivated enough to pursue it even in the face of severe criticism and cynicism. It's a refreshing change of pace from those pushing us to deliberately trigger the end of the world.
That was a good read. I haven't heard much about Kurzweil, but I'm interested to learn more about him. The singularity isn't something I think much about, but it's bound to happen sooner or later as long as technological advances continue at their current rate. The question is what happens at that point, though: will it be available to the masses, and if so, what happens with the burgeoning population that will result? I don't, however, think it will occur by 2045 as he asserts.
> The question is what happens at that point, though: will it be available to the masses, and if so, what happens with the burgeoning population that will result?

I'm currently working on a paper on reproduction for a special publication on Radical Life Extension. I think Life History Theory suggests that if an organism doesn't age, it also won't reproduce, instead investing the extra energy in growth. I think we are already seeing the decline of biological reproduction, and that this should transition quite smoothly into a 2050 world where biological reproduction has been completely replaced by cultural reproduction.
I wonder if the world's governing bodies are prepared to handle such a transition. Something this revolutionary should surely be reviewed thoroughly before it becomes reality. I understand that technology usually runs ahead of policy, but this seems like too big a leap in ethics/morality to wait until the technology arrives.
I've never actually read any Kurzweil, but I do share his belief to a point. In my twenties I kind of came to a place of believing that humans would eventually reach the singularity he refers to, but it was based on much less research and more on general science reading and philosophy. I think Hofstadter took me there more than anybody else. I always assumed it would happen beyond my lifetime, but that we were near the cusp, within a couple of centuries. I'd love for his aspirational timeline to be true, of course. I don't see it, but I do believe in the shape of the graph, and when it does happen it will probably happen with surprising speed. The tone of the article makes me curious about the very human side of Kurzweil. It sounds like he wants it so badly that it may have become religion for him. I don't know if that is accurate, but it certainly occurs to me. He sounds like he's ready for the worst, but I almost feel bad for him in a way. He seems way too certain, and to read about him keeping himself in perfect shape for that little extra push towards the end... if it were a Hollywood movie, they'd certainly write him as a tragic character anyway.
> The tone of the article makes me curious about the very human side of Kurzweil. It sounds like he wants it so badly that it may have become religion for him. I don't know if that is accurate, but it certainly occurs to me. He sounds like he's ready for the worst, but I almost feel bad for him in a way. He seems way too certain, and to read about him keeping himself in perfect shape for that little extra push towards the end... if it were a Hollywood movie, they'd certainly write him as a tragic character anyway.

I understand your perspective; sometimes I have had similar thoughts about Kurzweil. At the same time, I relate to him. The notion of death deeply troubles me, and I know that intelligence can be so much more than we currently are. I know that existence (as great as it can currently be) can still be so much more.
It can be almost crippling to think about the possibilities of intelligence and existence extended. It may be the most fundamental human desire. Beyond just exciting the curious mind, the fantasy is intimately entwined with our most basic self-preservation instincts. It is arguably the most interesting and important question human beings can tackle. No goals loftier. No stakes higher. If it has become religion for Kurzweil, insofar as it has, I don't blame him one bit. This is where every religion humanity has ever known has come from.
Good read. I'm fascinated by what I've learned of Kurzweil's work, and this article dovetails nicely with your recent PBS Digital Studios video. You may recall that in the comments you and I talked a bit about how some of the embrace given to these theories stems from a similar "fear of death" that spawned religions. You mentioned that you were:

> in talks with a researcher at the Global Brain Institute to conduct a psychological survey of singularity/global brain theorists and their perspective on death, the idea of singularity as religion, and whether they feel their own fear of death affects extrapolations and predictions for a 2045 singularity.

What's your take on Kurzweil's 2045 prediction? Is it rooted in solid thinking, or is it wishful?
In your preliminary work, have you found that the younger the theorist, the later the date for the singularity?
> What's your take on Kurzweil's 2045 prediction? Is it rooted in solid thinking, or is it wishful?

> In your preliminary work, have you found that the younger the theorist, the later the date for the singularity?

So far, what I think I'm going to find is that older theorists will be disappointed with the notion that they'll "just miss" immortality. That was the general sense I got from an interview I conducted two weeks ago with a 70-year-old physicist who was well aware of "singularity" and "global brain" theory. It's not just Kurzweil at this point. Most theory related to key aspects of the singularity (e.g., the Internet, computers, artificial intelligence, nanotechnology) converges on interesting possibilities in the 2040s. Even Robin Hanson's analysis of human economic growth modes suggests big changes in the 2040s. In my own work I am interested in understanding the energy aspect of our next system (as it relates to the Human Metasystem Transition Theory I proposed at the Global Brain Institute). From my perspective, the next system is dependent on the exploitation of a new energy source, and most projections indicate (to me at least) that this should be realized in the 2030s, enough time to support the transition to a post-singularity humanity in the 2040s. The fact that so many different people have constructed models of so many different aspects of human evolution and technological change, and that so many of them point to the 2040s, definitely increases my confidence. I remember being most convinced when I was doing the research for developing Human Metasystem Transition Theory: when I compared the Moore's Law growth curve with the economic exponential growth model, they both converged on a system change in the 2040s.
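As a rough, back-of-the-envelope illustration of how two very different exponential curves can cross their respective thresholds in the same decade, here is a small sketch. The baselines, thresholds, and doubling times are illustrative assumptions chosen for demonstration, not the actual parameters of Kurzweil's or Hanson's models:

```python
# Illustrative sketch only: the baselines, thresholds, and doubling times below are
# assumptions chosen for demonstration, not figures from any published model.
import math

def years_until_threshold(baseline, threshold, doubling_time_years):
    """Years of steady exponential growth needed for `baseline` to reach `threshold`."""
    doublings = math.log2(threshold / baseline)
    return doublings * doubling_time_years

START_YEAR = 2013  # roughly when this discussion is taking place

# Hypothetical Moore's-Law-style curve: computations/sec per $1000,
# assuming a ~1.5-year doubling time and a brain-scale target of ~1e16 ops/sec.
compute_years = years_until_threshold(baseline=1e10, threshold=1e16,
                                       doubling_time_years=1.5)

# Hypothetical economic curve: gross world product in trillions of dollars,
# assuming a ~20-year doubling time and an arbitrary 3x "new growth mode" threshold.
econ_years = years_until_threshold(baseline=75, threshold=225,
                                   doubling_time_years=20)

print(f"compute curve crosses its threshold around {START_YEAR + compute_years:.0f}")
print(f"economic curve crosses its threshold around {START_YEAR + econ_years:.0f}")
```

The point isn't the particular numbers; it's that independent exponentials with plausible parameters can land in the same window, which is why the convergence of so many models on the 2040s is suggestive rather than coincidental.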
That's very interesting. Does that mean that the driving force behind all of those theories converging on the 2040s is the discovery and exploitation of a new energy source? I wonder what the energy source will be; for some reason it seems doubtful that it will be anything we are currently using, unless a major nuclear, solar, or geothermal breakthrough occurs.
> Does that mean that the driving force behind all of those theories converging on the 2040s is the discovery and exploitation of a new energy source?

Well, my theory is built around energy exploitation... so if my work is published, that will be my contribution to this literature. From my perspective, a new system requires a new energy source; we can't have a singularity without new energy to power it. So if I'm right that a complete transfer should occur in the 2030s, that would be enough time to power a singularity in the 2040s. The initial speculation that the 2040s would be the "decade of the singularity" came from Kurzweil's extrapolation of Moore's Law. Since then, several other models related to other information and communication technologies (e.g., the Internet, nanotechnology) suggest a quite sophisticated human technological system by that time.
I'm encouraged by how quickly technology has changed in just the last ten years. I can't imagine what technologies will be afoot 30 years from now, with biology and technology seemingly converging. My question is this: do you think the ability to curb death will become universally available quickly, or will it take many years to become widely available? In other words, by the 2040s should I be setting aside money now? My singularity savings account?
> My question is this: do you think the ability to curb death will become universally available quickly, or will it take many years to become widely available?

> In other words, by the 2040s should I be setting aside money now? My singularity savings account?

I think the emergence of our next system will take a mere decade to diffuse (and I think this will include the system structures and technologies to keep all agents alive). My reasoning for a decade-long diffusion time is based on extrapolating from previous system transitions: the hunting transition took hundreds of thousands of years, the agricultural transition took tens of thousands of years, the industrial transition took centuries, and the global brain should emerge in a decade. As for savings, that's a personal decision, of course. I personally think that a global brain system will also be a post-scarcity system (or even one of radical abundance), so it may be a system that doesn't require such a thing as "retirement savings" (since I don't really know whether people would be retiring).
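To make that extrapolation concrete, here is a minimal sketch using only the rough orders of magnitude above; the durations are stand-in values, not measurements:

```python
# Illustrative sketch only: durations are the rough orders of magnitude cited above.
import math

transitions = [
    ("hunting",      300_000),  # ~hundreds of thousands of years
    ("agricultural",  30_000),  # ~tens of thousands of years
    ("industrial",       300),  # ~centuries
]

# How much faster each transition diffused than the one before it.
ratios = [transitions[i][1] / transitions[i + 1][1]
          for i in range(len(transitions) - 1)]
print("speed-up factors between transitions:", ratios)  # [10.0, 100.0]

# Extrapolate one more step using the geometric mean of the observed speed-ups.
typical_speedup = math.prod(ratios) ** (1 / len(ratios))
next_duration = transitions[-1][1] / typical_speedup
print(f"extrapolated diffusion time of the next transition: ~{next_duration:.0f} years")
```

On these assumed values the extrapolation lands at roughly a decade, which is the figure given above.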