OftenBen  ·  2630 days ago  ·  link  ·    ·  parent  ·  post: The Exact Opposite Is True

    I've yet to see a worrisome example.

So, this is part of the concern, is it not? The idea is that a sufficiently advanced AGI system would be capable of, and in some sense obligated to, flying under the radar until it could guarantee its own survival, assuming it has a sense of self-preservation similar to a human's. Loosely speaking, part of the 'threat' is that we won't see a worrisome example until the cost of a failure is astronomical.

mk  ·  2630 days ago  ·  link  ·  

Sure, but IMO that misunderstands the nature of intelligence. Intelligence is sense and interaction. Without such interaction, there is no comprehension of the environment in question. A model-less intelligence is like a boneless elephant.