So, this is part of the concern, is it not? The idea being that a sufficiently advanced AGI system would be capable, and in some sense obligated, to fly under the radar until such time as it could guarantee its own survival, assuming it has a sense of self-preservation similar to a human's. Loosely speaking, part of the 'threat' is that we won't see a worrisome example until the cost of a failure is astronomical.

I've yet to see a worrisome example.