Dictation fill-in-the-blanks: write only the blank content, don't copy the full text, 5-10 sentences, no numbering needed, mind the punctuation, and words repeated in speech due to stuttering and the like should be written only once~

Nick Bostrom is director of the Future of Humanity Institute at Oxford University. [---1---] Those are potential events that could threaten your existence and mine, and the whole human species.

Nick Bostrom: Nuclear proliferation, biological weapons, pandemic disease, despotism, various forms of social and economic collapse scenarios rank high on the list of near- to mid-term potential catastrophes.

[---2,3---]

Nick Bostrom: [---4---]

[---5---]

Nick Bostrom: [---6,7---]

If you're concerned about surviving a global catastrophe, Bostrom's best advice is to stay healthy. After all, it's still survival of the fittest.

I'm Lindsay Patterson from EarthSky, a clear voice for science. We're at earthsky.org.

【Proudly produced by the Audio-Visual Science Team】
He's organized a 2008 conference for scientists to gather to discuss catastrophic risks.

Humans have experienced global catastrophes in the past, Bostrom said. But modern technology has brought new potential risks.

We have risks that arise from powerful new technologies that we might develop, such as advanced nanotechnology and superintelligent machines.

He said the events of this century could determine the survival of our species.

This critical transition period might pose the biggest existential risk for humanity that we've ever faced, because we are developing very powerful technologies that we have no experience with. And it's unclear at this point whether we have the wisdom to use these technologies to our advantage rather than to our destruction.