
Professor Stephen Hawking has issued a stark warning, saying he believes human civilization faces extinction within the next 100 years thanks to man-made threats.
The astrophysicist warned users during a Q&A on Reddit that the rise of artificial intelligence, nuclear bombs, and aliens could see the entire human race wiped from the face of the Earth.

Anonhq.com reports:
While Hawking believes technology can ensure mankind’s survival, he simultaneously warns further developing AI could prove a fatal mistake. In a lengthy Q&A session on Reddit, Hawking explained how AI is humanity’s biggest existential threat:
“The real risk with AI isn’t malice but competence. A super intelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble.
“You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green energy project and there’s an anthill in the region to be flooded, too bad for the ants. Let’s not place humanity in the position of those ants.
“If they [artificially intelligent machines] become clever, then we may face an intelligence explosion, as machines develop the ability to engineer themselves to be far more intelligent. That might eventually result in machines whose intelligence exceeds ours by more than ours exceeds that of snails.”
The Independent quoted the scientist as saying:
“The human failing I would most like to correct is aggression. It may have had survival advantage in caveman days, to get more food, territory or a partner with whom to reproduce, but now it threatens to destroy us all.”
Intelligence and aggression have the capacity to destroy us, but what do they mean for humanity’s chances of being destroyed by aliens? For the past few years, Hawking has warned that if a more intelligent, more advanced alien civilization exists, it would not necessarily be friendly to less technologically advanced humans, and would have no problem conquering and colonizing the planet and eventually wiping out the human race. In April 2010, Hawking noted:
“If aliens ever visit us, I think the outcome would be much as when Christopher Columbus first landed in America, which didn’t turn out very well for the Native Americans. Such advanced aliens would perhaps become nomads, looking to conquer and colonize whatever planets they could reach. If so, it makes sense for them to exploit each new planet for material to build more spaceships so they could move on. Who knows what the limits would be?”
“If the government is covering up knowledge of aliens, they are doing a better job of it than they do at anything else.” ― Stephen Hawking
During a media event at the Royal Society in London in July 2015, Hawking voiced his fears again:
“We don’t know much about aliens, but we know about humans. If you look at history, contact between humans and less intelligent organisms has often been disastrous from their point of view, and encounters between civilizations with advanced versus primitive technologies have gone badly for the less advanced. A civilization reading one of our messages could be billions of years ahead of us. If so, they will be vastly more powerful, and may not see us as any more valuable than we see bacteria.”