For millennia, since the beginning of humanity even, we have told and shared stories. Stories of tyrant kings and bold warriors, filled with morals and emotions. Scribes and poets have taken those stories and made them permanent for us. They have given them endings. The stories give our lives and experiences meaning, something to relate to and learn from.
But who will finish our story? Who will write the last chapter of human history when we've all gone away? How can we be sure that we will even get an ending, let alone a "worthy" one? Will the story of us humans become something? Or maybe nothing?
Depressing, but true. Our very existence as a species is such a fluke. So many things had to occur just right for us to even be here, and all of that in less than a second on a cosmic scale. We are a grain of sand in the Sahara. Now you can interpret this in two ways: nothing really matters because we may never get off this rock, probably destroying ourselves long before the Sun explodes, or... it's beautiful because there is no meaning except to live our lives and love each other and celebrate the mindfuck that is our amazing luck to be here in the first place. I've got to get back to work. Have a great day everybody.
Robots. Hear me out. It's been said that artificial intelligence will be the last invention that humans will ever have to make. Imagine this: if the first generation of AI is as smart as a human, then the second generation could be exponentially more intelligent than a human, and so on. A few generations later we would be mere ants to AI, and we could very well be eliminated from the planet. At that point, the only thing left to write the final chapter in the history books about us (assuming the AI even wants to do that) would be robots.
I do not believe it is likely that a technological singularity would wipe us out. Of course, we'll have to be cautious in the way we develop these systems, but at the end of the day we would not give them free rein to do as they please. For example, look at the autopilot of a commercial airliner. They are usually designed "with redundancy and reliability as foremost considerations." In some cases, one aircraft will have multiple different autopilot systems. Each system will have been designed by a different team, potentially in a different language or on a different architecture, and will run independently. If at any time the majority of these systems don't agree on the next step they should take, control is handed back to the pilot (a toy sketch of that voting logic is below). Of course, these are not artificially intelligent systems in the way you're hypothesizing about. But I'd hope that there's no way super-intelligent AI would be developed without similar considerations being put in place. It would be very silly indeed just to let such a system go off and do as it pleases. Anyway, have you ever tried to get rid of ants? Those bastards are resilient.
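To make that disagree-and-disengage idea concrete, here's a minimal Python sketch. The command strings and the `vote` function are hypothetical, purely illustrative of the majority-vote pattern; real avionics voting is far more involved than this.

```python
from collections import Counter

def vote(commands):
    """Majority vote across independently implemented controllers.

    Returns the agreed-upon command, or None to signal that
    control should be handed back to the pilot.
    """
    command, count = Counter(commands).most_common(1)[0]
    if count > len(commands) // 2:  # strict majority required
        return command
    return None  # no consensus: disengage

# Three hypothetical autopilots, each built by a separate team,
# propose a pitch command for the same sensor input.
commands = ["pitch_up", "pitch_up", "pitch_down"]
decision = vote(commands)
print(decision or "AUTOPILOT DISENGAGED: pilot takes over")
```

The point being: no single implementation's fault can silently steer the plane, because the other copies outvote it, and total disagreement fails safe to the human.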
I think that if we outlast the first stages of true AI then we'll get to a point where our technology will allow us to fuse with the AI. It's a matter of not giving AI the access or capabilities to develop in areas that could threaten humans. That's why it's so important to try and find the best ethical system, because when it comes time to program these AIs, what are we going to tell them to value above all else? Progress? Human wellbeing?
Note to self: Be nice to AI. Got it. ;)
Do you think, knowing what you know of the human race, that we could be a "creator" type of story for them, something for them to hold onto, with "hope" or maybe even "reverence"? Or would we be a more sinister, villain-esque "this is what they put us through" kind of thing? I understand that they could be simply too analytical to know hope or disdain (depending on the level of AI we're talking about), but if we made them, then surely they must retain some of our inherently human characteristics?
I would love to say yes, but I imagine we won't make AI "in our image and likeness" to the point where they will have an internal disposition to worship, or at the very least highly value, a "creator" figure. If AI will exist, I imagine it would be strictly logical, with a given value (such as happiness or maybe progress) to maximize. What I hope the future holds is a mix of cold hard logic and abstract emotion. And with this mix, I want human consciousness and artificial intelligence to merge. One and the same. Nice to think about, no?
Tiger got to hunt, bird got to fly;
Man got to sit and wonder 'why, why, why?'
Tiger got to sleep, bird got to land;
Man got to tell himself he understand.
Well... the last human on Earth will most probably know little about the grand history of their race. It will perhaps be buried under the sand and dust of time. They will go on adventures, vanquish fell beasts in terrible dungeons, and live in blissful ignorance, unaware of the gravity of their fleeting existence. They won't be finishing our story; they will be starting, and completing, their own story. Oblivious to the past anxieties of their ancestors; peaceful in the void.
What if by pitting ourselves against future machines, potentially greater in intelligence and discipline and will; what if by imagining our story could somehow be overwritten; what if by hoping for an everlasting legacy (or life), we miss a greater story, and a more essential truth? Does "synthetic" or "artificial" have any real meaning? I figure there's no out-of-this-world, no matter how out-of-this-world. Technically speaking. We have progressively separated ourselves from nature, and this is its continuation.

It took us quite a long time to evolve the value systems we have today (however feeble they can be), but technology is advancing at an exponential rate, and we only recognize local, linear advancement. Peter Diamandis (Singularity University) explains it this way: you can imagine yourself walking 30 steps, or 30 meters, ahead (linear), but 30 exponential steps, each one doubling the last, gets you about 26 times around the planet (a quick check of that arithmetic is below).

When we think of AI advancing, we can kind of imagine cognition that is well beyond our current capacity (maybe IBM's Watson), but I think that's a fairly flat formulation. Any sufficiently advanced AI would need to be social in order to fit the description. That means it would need to be PRO-social, i.e., have emotions. Why wouldn't it be more human than we are? Us-plus? I could argue that we're collectively evolving; it's possible AI is just what's next on the wacky roadmap that began at the atom and moved (very roughly, skipping steps) to molecule to single-celled life to fish to bird to mammal to primate to us ---> ?. We're just a chapter. - julie
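For what it's worth, Diamandis's figure checks out under the doubling assumption (he doesn't spell out the step rule, so doubling is my reading). A quick back-of-the-envelope in Python, taking Earth's equatorial circumference as roughly 40,075 km:

```python
# 30 linear one-meter steps vs. 30 doubling steps starting at one meter.
linear_m = 30                       # 30 meters: barely across the street
exponential_m = 2 ** 30             # 1,073,741,824 meters after 30 doublings

EARTH_CIRCUMFERENCE_M = 40_075_000  # equatorial circumference, ~40,075 km

laps = exponential_m / EARTH_CIRCUMFERENCE_M
print(f"{exponential_m:,} m is about {laps:.1f} trips around the Earth")
# -> 1,073,741,824 m is about 26.8 trips around the Earth
```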
Julie, yes, we are evolving more rapidly than ever. The Internet has been part of our lives for a mere 20 years. In the next 2 years another 3 billion people will be coming online from the third world, including those same people who have been taking all the MIT courses offered for free on the Internet. It's easy to imagine a few thousand Marie Curies, Louis Pasteurs, Albert Einsteins, and Maya Angelous about to shake our world! Technology is now intertwined with our biology. There is much to hope for, just as there is much to fear. It's hard, if not impossible, to see what shape we will be in in the future. I think of my children. As they grow it is easy to look at baby pictures and see where they came from, but I can never adequately imagine what they will become. Their physiology is easy to see and remark upon, and illustrates this well. But even more daunting is how their intelligence and behavior evolve. I am in awe of their abilities, so much more than my own at their age. And this is reflected in more than what they say and do - it's in society. For example, I am a math major, but my son has been studying, as far back as junior high, many of the concepts I learned only in college. Mythically, our story is in a great transformational age. Yes, it will be a rough ride ahead for many, perhaps us included. Nonetheless, it is the only ride there is! Let's step up, lead not follow, raise our bar, make a new day. Carpe diem!
A few years ago I sat next to a baby on a plane. She reached for my tablet and began swiping right to left—an infant. I looked around and all the babies were swiping. For all I knew, they were e-filing their own taxes. If AI are as cute and adaptive (and emotionally unstable) as babies, we are in trouble. Otherwise, I hope I'm alive to see the machines our babies create.
I personally think the worst that could happen to humanity is if everyone devolves into grey aliens and the best thing that could happen is that we find ways of shaping the world as easily as Minecraft biomes when things like biotech and fabrication become much cheaper.
My biology teacher in high school proposed a theory to us that said we would evolve into The Greys. My teenage mind was blown, and I still think about it these days. But he also believed in the "Ancient Astronauts" malarkey, so I can't really vouch for his sanity. :p
I know that this comment will sound flip and dismissive. I truly don't mean it that way. But: "Who will finish our story?" I ask, who cares?

I believe that we are made up of reactions. Biology, chemistry, electricity - we are powered by atoms, electrons, neutrons, basic impulses. I do not really believe that any part of us continues on after our body, our housing, wears out. By my logic it can't - if our consciousness really is as simple as a product of what our body does, of synapses firing and nerves responding, then there is no reason, in fact no way, that it could continue beyond the mechanisms that create it. I find this reassuring. Not all people do. I don't know if it's "most" or "some," I just know that many people hear this and say "That's depressing - to think that once our body dies, that is it," (flick your fingers for dramatic flair, perhaps) "our existence ceases." Many people care about legacy, progeny, history, memory - about whether they will last longer than their current iteration of consciousness in their current body, or not. That's okay. I understand that you may want what I don't want, and so on. I am not offended. It is okay, even reasonable, for you to want some things that I do not. A lot of people want to feel that their death is not their end. I get that they want it and can even see why. But it's not what I believe, it's not what makes me feel at peace, it's not a belief I find supported, sensical, or, for me, positive.

I don't care who will finish 'our story' - humanity's, I believe, is what we are driving at. The story of humanity and how it ended. I tell you what, if anyone finishes it, it won't be a human. By definition it can't be. I don't care what our story is. I have little to no control over that. And why be arsed to care about people living millennia in the future that I will never know, who may be complete aliens in fact? Why bother caring what the future may think of us? I cannot control billions. I care that if anyone wants to finish my story and tell it to others, they can tell a good, respectable, honorable, but humane, passionate, and fulfilled one. I care that my story, such as it is or can be, is the best one that I can live, and follows my ideals, morals, and beliefs as consistently as I can manage. I care that if anyone speaks of me after my death they have more good to offer up than bad. I cannot control who tells 'my story,' or 'humanity's story.' I cannot control their lens, and sometimes people will paint you as a villain no matter what. If someone were to do that, I could not stop them. But what would satisfy me would be knowing their complaints were spurious, false, or otherwise wrong. And I can only control the truth of what they say about me by living it well, by being a good person. If I want our story to be a good one, I have to live as well as possible, and because of my beliefs, I feel I have only this one lifetime to do it. It is not "who will finish our story?" but "will it be a good one?" that I care about. And I can only control the small, small part of that story that would be mine, if ever anyone cared enough to spin humanity into one. I can't control the story of humanity, and if I make even an impact on it, I will be a lucky one. I can only control my life, and my part of it.
I feel the same, in a sense, as you do. I think we get one shot at things, and that's okay... But where we differ is that I wish I could believe in something after it all. I want there to be something... more? Better? I don't quite know. But I wish I could just believe like I see some people believe. Good point on which question I should be asking. Will it be a good one? Maybe, most likely, but I guess we'll never know for sure.
Why not? We did things. Great (and terrible) things. We will continue to do more things, and I think that deserves an ending. Who gives that to us, and when, is up in the air, certainly... But I do care. It does matter.
You don't exactly survive through millions of people remembering you, because when you die you lose all your memories, experiences, thoughts and feelings. You won't be able to feel anything ever again, and it doesn't matter whether or not others remember you, because you will not.
Technically, if you reproduce, your genetic memory will live on as long as your bloodline. I think it was Aristotle or Socrates who stated that every man seeks immortality. A common man would go about this by having children and making sure his next generation could follow suit, but this was futile, and at some point this man would be forgotten and lost in time. A great man would live on through his ideas and virtues. Also, how do you know what happens when you die?
If you have children they are not you; you may share some of the same genes, but you do not control them like you control your body, so they are not you. It still doesn't matter if your genetic memory survives, because you as a person will not. I do not know what happens after death, but all the theories about an afterlife or reincarnation are just ideas that some people made up, without any evidence. I think it is more likely that you feel nothing, just as before you were born, than any idea that people made up.
Children are exactly half of you, genetically, so they are definitely part you. And genetic memory is just as valid as your temporary memories, except it lasts longer than you or your body.
I believe that if we were to collectively stop caring, we would essentially be admitting defeat. Against what? Everything, I suppose. Being human, some might say, is synonymous with caring... Otherwise, how have we made it to this point?