by Eric Pfeiffer
May 2, 2013

from Yahoo!News Website


The crazy man walking down the city street holding a sign that reads “The end is near” might just have a point.


A team of mathematicians, philosophers and scientists at Oxford University’s Future of Humanity Institute say there is ever-increasing evidence that the human race’s reliance on technology could, in fact, lead to its demise.


The group has a forthcoming paper entitled “Existential Risk Prevention as the Most Important Task for Humanity,” arguing that we face a real risk to our own existence. And not a slow demise in some distant, theoretical future.


The end could come as soon as the next century.

"There is a great race on between humanity’s technological powers and our wisdom to use those powers well," institute director Nick Bostrom told MSN. "I’m worried that the former will pull too far ahead."

There’s something about the end of the world that we just can’t shake.


Even with the paranoia of 2012 Mayan prophecies behind us, people still remain fascinated by the potential for an extinction-level event. And popular culture is happy to indulge in our anxiety.


This year alone, two major comedy films (“The World’s End” and “This is the End”) are set to debut, both taking a humorous look at end-of-the-world scenarios.


For its part, NASA released a series of answers in 2012 to frequently asked questions about the 'end of the world.'


Interestingly, Nick Bostrom writes that well-known threats, such as asteroids, supervolcanic eruptions and earthquakes are not likely to threaten humanity in the near future.


Even a nuclear explosion isn’t likely to wipe out the entire population; enough people could survive to rebuild society.

“Empirical impact distributions and scientific models suggest that the likelihood of extinction because of these kinds of risk is extremely small on a time scale of a century or so,” he writes.

Instead, it’s the unknown factors behind innovative technologies that Bostrom says pose the greatest risk going forward.


In other words, these emerging technologies could become our own worst enemy, if they aren’t already. Bostrom calls them,

“threats we have no track record of surviving."


"We are developing things that could go wrong in a profound way," the institute's Dr. Sean O'Heigeartaigh told the BBC in a recent interview.


"With any new powerful technology we should think very carefully about what we know - but it might be more important to know what we don't have certainty about."

However, it’s not all bad news.


Bostrom notes that while a lack of understanding surrounding new technology poses huge risks, it does not necessarily equate to our downfall.

“The Earth will remain habitable for at least another billion years. Civilization began only a few thousand years ago. If we do not destroy mankind, these few thousand years may be only a tiny fraction of the whole of civilized human history,” he writes.


“It turns out that the ultimate potential for Earth-originating intelligent life is literally astronomical.”


The End of Humanity is Near

-   This Time for Real   -
by Emma Pearse
May 1, 2013

from MSN-News Website




The end of humanity? A skeleton mask at a 'Day of the Dead' event in Mexico City.
A team of researchers say humanity may all be celebrating a massive "Day of the Dead" soon.



As one renowned physicist urges us all to go into space to save ourselves, others are agreeing that humans may become extinct.




This isn’t media hype, says a team of deep-reaching philosophers (naturally), scientists and mathematicians at Oxford University’s Future of Humanity Institute, who are finding more and more evidence for this conclusion.

"There is a great race on between humanity’s technological powers and our wisdom to use those powers well," institute director Nick Bostrom said. "I’m worried that the former will pull too far ahead."

In a forthcoming paper, "Existential Risk Prevention as the Most Important Task for Humanity," Bostrom and his mission crew explore today’s risk of premature human extinction.

"Several key concepts and insights have recently started falling into place," Bostrom says. "They make it possible to look at big picture questions in a different way than before, and to start doing serious analysis on some of these questions."

This could be the final century for humanity, Bostrom told the BBC, because our advancement in technology has brought us to a point at which technology is far more sophisticated than we are.


This is not news to science fiction authors and Hollywood directors - Isaac Asimov’s 1950 collection "I, Robot" and Danny Boyle’s 2002 box-office hit "28 Days Later," to name just two - who imagine, sometimes presciently, worlds in which our meddling with technology could drive us into extinction.


Now Bostrom and contemporaries are removing the fiction from the predictions and warning that we might be almost there.


Legendary physicist and cosmologist Stephen Hawking, who has been talking about our vulnerability for years, is still on a mission to wake us up.

"We won't survive another 1,000 years without escaping our fragile planet," Hawking told an audience in Los Angeles last week, according to the Los Angeles Times.


“We must continue to go into space for humanity.”

Hawking, who is on the board of the new Center for the Study of Existential Risk at Cambridge University, has always said that it’s our curiosity that might save us.

“If you understand how the universe operates, you control it in a way,” he told the LA audience.

Synthetic biology, nanotechnology and machine intelligence are areas that have contributed greatly to our quality of life, but also greatly threaten our future life.


We face nuclear destruction and, as we’ve been hearing for years, environmental Armageddon.

"With any new powerful technology, we should think very carefully about what we know," the Future of Humanity Institute's Dr. Sean O'Heigeartaigh told the BBC last week.


"But it might be more important to know what we don't have certainty about."

O'Heigeartaigh points out a conundrum in the Hawking/Carl Sagan theory that knowledge is power.


The more we investigate and experiment with technology, the less control we seem to have.

"We are developing things that could go profoundly wrong," he told BBC.

Can knowledge save us? Maybe. Maybe not.

"It's very hard to be completely certain about complex risks," O'Heigeartaigh told MSN News. "Some factors are hard to predict, and there's always a chance that we've missed something important or made an error."

What does this mean?


It means we might not even know how high the risk of premature extinction to humanity is.

"Estimates of 10-20 percent total existential risk in this century are fairly typical among those who have examined the issue," the Future of Humanity Institute paper states.


"The most reasonable estimate might be substantially higher or lower."

This uncertainty is precisely the motivation behind such research.

"This becomes especially important when we look at existential risks, many of which are high-impact, low-probability events," says O'Heigeartaigh.

That is, events that could determine whether or not we’ll be around in the next century.