...to kill human beings? This has been a big discussion as of late and has surely been mentioned elsewhere on the forum before. It has also been a major theme in media such as books, television, and movies. As our technology progresses and as we develop AI, the once-fictional idea of robots overthrowing humanity and wiping out human life is becoming a real concern. Machines are already taking our jobs, so what's to say we won't design them to take over the rest of our lives? An article on Gizmodo talks about the potential.
While some scientists might roll their eyes at any mention of a Singularity, plenty of experts and technologists—like, say, Stephen Hawking and Elon Musk—have warned of the dangers AI could pose to our future. But while they might urge us to pursue our AI-related studies with caution, they're a bit less clear on what exactly it is we're being cautious against. Thankfully, others have happily filled in those gaps. Here are five of the more menacing destruction-by-singularity prophecies our brightest minds have warned against.
What do YOU think will happen now that we are rapidly discovering new ways to improve our technologies?
source