Professor Stephen Hawking believes the future of the human race depends on our ability to explore space.
During a tour of London’s Science Museum, the 73-year-old said that landing on the moon gave us new perspectives on life on Earth, and that this outlook must develop if we are to survive.
He also said aggression should be weeded out of the human race and replaced by empathy to avoid a major nuclear war ending civilization as we know it.
Professor Hawking made the comments while escorting an American visitor around the museum as part of a ‘Guest of Honor’ prize.
Adaeze Uyanwah, 24, from Palmdale, California, won the tour after producing a blog and video describing a ‘perfect day’ in the UK capital.
She asked Professor Hawking which human shortcomings he would alter, and which virtues he would enhance, if this were possible.
‘The human failing I would most like to correct is aggression. It may have had survival advantage in caveman days, to get more food, territory or partner with whom to reproduce, but now it threatens to destroy us all.
‘The quality I would most like to magnify is empathy. It brings us together in a peaceful, loving state.’
The professor added that human space exploration was ‘life insurance’ for the human race and must continue.
‘Sending humans to the moon changed the future of the human race in ways that we don’t yet understand,’ he said.
‘It hasn’t solved any of our immediate problems on planet Earth, but it has given us new perspectives on them and caused us to look both outward and inward.
‘I believe that the long term future of the human race must be space and that it represents an important life insurance for our future survival, as it could prevent the disappearance of humanity by colonizing other planets.’
Ms Uyanwah, a teacher and creative writer, who beat more than 10,000 international contestants to win the prize, said: ‘It’s incredible to think that decades from now, when my grandchildren are learning Stephen Hawking’s theories in science class, I’ll be able to tell them I had a personal meeting with him and heard his views first hand. It’s something I’ll never forget.’
The Guest of Honor competition was organised by VisitLondon.com.
In November, Elon Musk, the entrepreneur behind SpaceX and Tesla, warned that the risk of ‘something seriously dangerous happening’ as a result of machines with artificial intelligence could materialise in as few as five years.
He has previously linked the development of autonomous, thinking machines to ‘summoning the demon’.
Speaking at MIT’s AeroAstro Centennial Symposium in October, Musk described artificial intelligence as our ‘biggest existential threat’.
He said: ‘I think we should be very careful about artificial intelligence.
If I had to guess at what our biggest existential threat is, it’s probably that.
So we need to be very careful with artificial intelligence.
‘I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.
‘With artificial intelligence we’re summoning the demon.
You know those stories where there’s the guy with the pentagram, and the holy water, and … he’s sure he can control the demon? Doesn’t work out.’
In December 2014, Professor Hawking issued another warning – that artificial intelligence could spell the end of the human race.
Speaking at an event in London, the physicist told the BBC: ‘The development of full artificial intelligence could spell the end of the human race.’
And in January, a group of scientists and entrepreneurs, including Elon Musk and Professor Hawking, signed an open letter promising to ensure AI research benefits humanity.
The letter warned that without safeguards on intelligent machines, mankind could be heading for a dark future.
The document, drafted by the Future of Life Institute, said scientists should seek to head off risks that could wipe out mankind.
And the authors said there is a ‘broad consensus’ that AI research is making good progress and would have a growing impact on society.