Hoffman Amplifiers Tube Amplifier Forum
Other Stuff => Other Topics => Topic started by: TNblueshawk on March 24, 2015, 09:48:38 am
-
I've worried about this for several years, but I'm not very smart and have never been accused of being Stephen Hawking or anyone else with a high IQ :laugh: Maybe I've watched too many movies. At almost 52 I probably won't have to worry about it much, assuming I make it into my 70s or longer, but technology won't go backwards (tube amps aside :icon_biggrin: ), so who knows.
Anyone else worried?
Steve Wozniak: The Future of AI Is 'Scary and Very Bad for People'
We should all be getting a litttttle nervous: The robot apocalypse is brewing.
Or at least, that's what a growing number of tech visionaries are predicting. In an interview (http://www.afr.com/technology/apple-cofounder-steve-wozniak-on-the-apple-watch-electric-cars-and-the-surpassing-of-humanity-20150323-1m3xxk) with The Australian Financial Review, Apple co-founder and programming whiz Steve Wozniak added his own grave predictions about artificial intelligence's detrimental impact on the future of humanity to warnings from the likes of Elon Musk, Bill Gates and Stephen Hawking.
"Computers are going to take over from humans, no question," he told the outlet. Recent technological advancements have convinced him that writer Raymond Kurzweil – who believes machine intelligence will surpass human intelligence within the next few decades – is onto something.
"Like people including Stephen Hawking and Elon Musk have predicted, I agree that the future is scary and very bad for people," he said. "If we build these devices to take care of everything for us, eventually they'll think faster than us and they'll get rid of the slow humans to run companies more efficiently."
Related: Bill Gates Is Skeptical of AI. But After This Little Robot Left *Me* a Personal Love Letter, I Can't Help But Disagree. (http://www.entrepreneur.com/article/242431)
Musk, the CEO of Tesla, has been the most vocal about his concerns about AI, calling it the "biggest existential threat" to mankind. He is an investor in DeepMind and Vicarious, two AI ventures, but “it’s not from the standpoint of actually trying to make any investment return," he said last summer (http://www.entrepreneur.com/article/234968). "I like to just keep an eye on what’s going on…nobody expects the Spanish Inquisition,” Musk said. “But you have to be careful.”
Meanwhile, in a Reddit 'Ask Me Anything' Bill Gates voiced similar reservations (http://www.entrepreneur.com/article/242431): "I agree with Elon Musk and some others on this and don't understand why some people are not concerned," he wrote. Similarly, physicist Stephen Hawking has warned (http://www.entrepreneur.com/article/240389) that AI could eventually "take off on its own." It's a scenario that doesn't bode well for our future as a species: "Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded," he said.
Worried yet? Wozniak is.
"Will we be the gods? Will we be the family pets? Or will we be ants that get stepped on? I don't know about that …"
-
If computers learn to be evil and cruel then yes, we should worry.
Right now, the only evil and cruel entities are the designers at Microsoft, that won't leave Win7 well enough alone! :icon_biggrin:
Oh, and the smartphone, which is giving 80% of the general public horrible posture.
-
Time to re-read Asimov. And join the resistance!
-
I'm too old to resist. I plan to just sneak off to an island in flip flops and a wife beater, hoping no one gets around to that island for a long time. I just have to figure out which island. I'll make my own beer to subsist on, too.
I have to admit I had to Google Asimov, and well, let me reiterate... I've never been confused with a Mensa candidate :l2:
John, if humans are doing the programming I can assure you they will be evil and mean as we will make sure of it :sad:
-
John, if humans are doing the programming I can assure you they will be evil and mean as we will make sure of it
But there will be the Three Laws... until there is the Fourth!
-
John, if humans are doing the programming I can assure you they will be evil and mean as we will make sure of it
But there will be the Three Laws... until there is the Fourth!
So true! But, I never underestimate our ability to destroy things including ourselves unfortunately.
-
I was thinking a bit more about this. Since our brain is really just a supercomputer, there's no reason why at some point "the machines" won't become self-aware enough to "think" for themselves. But they still won't have a soul.
Not unless they're tube powered.