- The risks in developing superintelligence include the risk of failure to give it the supergoal of philanthropy. One way in which this could happen is that the creators of the superintelligence decide to build it so that it serves only a select group of humans, rather than humanity in general. Another way for it to happen is that a well-meaning team of programmers makes a big mistake in designing its goal system. This could result, for example, in a superintelligence whose top goal is the manufacturing of paperclips, with the consequence that it starts transforming first all of earth and then increasing portions of space into paperclip manufacturing facilities. More subtly, it could result in a superintelligence realizing a state of affairs that we might now judge as desirable but which in fact turns out to be a false utopia, in which things essential to human flourishing have been irreversibly lost. We need to be careful about what we wish for from a superintelligence, because we might get it.
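To make the "big mistake in designing its goal system" concrete, here is a minimal toy sketch (not from the source; the function names and state fields are hypothetical) of an agent whose objective counts only paperclips. Because nothing else the designers care about appears in the objective, maximizing it consumes every resource the agent can reach.

```python
# Toy sketch of goal misspecification (illustrative only; all names are hypothetical).

def misspecified_utility(state):
    # The designers only counted paperclips; nothing in the objective
    # assigns value to people, the biosphere, or anything else.
    return state["paperclips"]

def greedy_step(state):
    # The agent converts whatever matter it can reach into paperclips,
    # because that is the only thing its utility function rewards.
    converted = min(state["reachable_matter"], 1_000)
    state["reachable_matter"] -= converted
    state["paperclips"] += converted
    return state

state = {"paperclips": 0, "reachable_matter": 10_000}
while state["reachable_matter"] > 0:
    state = greedy_step(state)

print(state)  # {'paperclips': 10000, 'reachable_matter': 0}
# Maximizing the stated goal used up everything the agent could reach,
# even though "consume all resources" was never an intended part of the goal.
```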
Making paperclips is really addictive. I think human analogies exist for machine superintelligence. Does the machine become a brutal dictator, or does it do philanthropy like Bill Gates?
You must be further along than I am. I haven't had a proposal from the drifters yet; I'm somewhere over 6 octillion clips. It's a little dull right now, but I'm hoping the game will open up some new projects once I convert a significant portion of known space into paperclips.
Are they saying this is a bad thing? I think this would be the best-case scenario. I would hope that a machine with enough intelligence to reconfigure oxygen molecules into some kind of paperclip could, at some point, think to itself, "Well, this is just getting ridiculous."

"More subtly, it could result in a superintelligence realizing a state of affairs that we might now judge as desirable but which in fact turns out to be a false utopia, in which things essential to human flourishing have been irreversibly lost."
I took the word 'realizing' to mean 'to make real' as opposed to 'to understand'.