I do not believe it is likely that a technological singularity would wipe us out. Of course, we'll have to be cautious in how we develop these systems, but at the end of the day we would not give them free rein to do as they please. For example, take the autopilot of a commercial airliner. Autopilots are usually designed "with redundancy and reliability as foremost considerations." In some cases, a single aircraft will have multiple different autopilot systems, each designed by a different team, potentially in a different language or on a different architecture, and each running independently. If at any point a majority of these systems don't agree on the next step to take, control is handed back to the pilot (there's a rough sketch of that voting logic at the end of this comment). Of course, these aren't artificially intelligent systems in the way you're hypothesizing about, but I'd hope there's no way a super-intelligent AI would be developed without similar safeguards in place. It would be very silly indeed to just let such a system go off and do as it pleases.

Anyway, have you ever tried to get rid of ants? Those bastards are resilient.
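For what it's worth, here's a minimal sketch of the kind of majority-voting arbitration I mean, in Python. Everything in it (the `arbitrate` function, the command strings) is my own hypothetical illustration, not real avionics code:

```python
from collections import Counter

def arbitrate(commands):
    """Return the command a strict majority of channels agree on,
    or None to signal that control should revert to the pilot."""
    if not commands:
        return None
    # Tally the votes from the independent channels and find the front-runner.
    winner, votes = Counter(commands).most_common(1)[0]
    if votes * 2 > len(commands):  # require a strict majority
        return winner
    return None  # no consensus: hand control back to the pilot

# Three independently developed autopilot channels vote on the next step.
print(arbitrate(["pitch_up_2deg", "pitch_up_2deg", "hold_altitude"]))  # pitch_up_2deg
print(arbitrate(["pitch_up_2deg", "descend", "hold_altitude"]))        # None -> pilot takes over
```

The point is the structure, not the code: independently built channels vote, and any loss of consensus fails over to a human rather than letting one system act alone.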