Nick Bostrom Quotes

On this page you can find a collection of Nick Bostrom's best quotes! We hope some of these sayings from the philosopher Nick Bostrom (born March 10, 1973) will inspire you to new achievements! There are currently 20 quotes on this page. Share our collection of quotes with your friends on social media so that they can find something to inspire them too!
  • The Internet is a big boon to academic research. Gone are the days spent in dusty library stacks digging for journal articles. Many articles are available free to the public in open-access journals or as preprints on the authors' websites.

    "Nick Bostrom on the Future, Transhumanism and the End of the World". Interview with Jonathan McCalmont, ieet.org. January 22, 2007.
  • Are you living in a computer simulation?

    Nick Bostrom, Milan M. Cirkovic (2011). “Global Catastrophic Risks”, p.141, Oxford University Press
  • When we are headed the wrong way, the last thing we need is progress.

    "Perfection Is Not A Useful Concept". Interview with The European Magazine, www.theeuropean-magazine.com. June 13, 2011.
  • We would want the solution to the safety problem before somebody figures out the solution to the AI problem.

  • The challenge presented by the prospect of superintelligence, and how we might best respond, is quite possibly the most important and most daunting challenge humanity has ever faced. And - whether we succeed or fail - it is probably the last challenge we will ever face.

    Nick Bostrom (2014). “Superintelligence: Paths, Dangers, Strategies”, p.7, Oxford University Press (UK)
  • Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors. The reactive approach - see what happens, limit damages, and learn from experience - is unworkable. Rather, we must take a proactive approach. This requires foresight to anticipate new types of threats and a willingness to take decisive preventive action and to bear the costs (moral and economic) of such actions.

  • We should not be confident in our ability to keep a super-intelligent genie locked up in its bottle forever.

  • Far from being the smartest possible biological species, we are probably better thought of as the stupidest possible biological species capable of starting a technological civilization - a niche we filled because we got there first, not because we are in any sense optimally adapted to it.

    Nick Bostrom (2014). “Superintelligence: Paths, Dangers, Strategies”, p.53, OUP Oxford
  • Had Mother Nature been a real parent, she would have been in jail for child abuse and murder.

    "In Defense of Posthuman Dignity". Bioethics, Vol. 19, No. 3, nickbostrom.com. 2005.
  • It’s unlikely that any of those natural hazards will do us in within the next 100 years if we’ve already survived 100,000. By contrast, we are introducing, through human activity, entirely new types of dangers by developing powerful new technologies. We have no record of surviving those.

  • Human nature is a work in progress.

  • The cognitive functioning of a human brain depends on a delicate orchestration of many factors, especially during the critical stages of embryo development - and it is much more likely that this self-organizing structure, to be enhanced, needs to be carefully balanced, tuned, and cultivated rather than simply flooded with some extraneous potion.

    Nick Bostrom (2014). “Superintelligence: Paths, Dangers, Strategies”, p.43, OUP Oxford
  • The greatest existential risks over the coming decades or century arise from certain anticipated technological breakthroughs that we might make - in particular, machine superintelligence, nanotechnology, and synthetic biology. Each of these has an enormous potential for improving the human condition by helping cure disease, poverty, etc. But one could imagine them being misused, used to create powerful weapon systems, or even some kind of accidental destructive scenario, where we suddenly are in possession of some technology that's far more powerful than we are able to control or use wisely.

    Source: www.pbs.org
  • There are some problems that technology can't solve.

  • Machine intelligence is the last invention that humanity will ever need to make.

    "What happens when our computers get smarter than we are?". TED Talk, www.ted.com. March 2015.
  • In the next century, we will be inventing radical new technologies - machine intelligence, perhaps nanotech, great advances in synthetic biology and other things we haven't even thought of yet. And those new powers will unlock wonderful opportunities, but they might also bring with them certain risks. And we have no track record of surviving those risks. So if there are big existential risks, I think they are going to come from our own activities and mostly from our own inventiveness and creativity.

    Source: www.pbs.org
  • Knowledge about limitations of your data collection process affects what inferences you can draw from the data.

    Nick Bostrom (2013). “Anthropic Bias: Observation Selection Effects in Science and Philosophy”, p.1, Routledge
  • How can we trace out the links between actions that people take today and really long-term outcomes for humanity - outcomes that stretch out indefinitely into the future? I call this effort macrostrategy - that is, to think about the really big strategic situation for having a positive impact on the long-term future. There's the butterfly effect: A small change in an initial condition could have arbitrarily large consequences.

    Source: www.pbs.org
  • For healthy adult people, the really big thing we can foresee are ways of intervening in the ageing process, either by slowing or reversing it.

    "The ideas interview: Nick Bostrom" by John Sutherland, www.theguardian.com. May 09, 2006.
  • The first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.

    Nick Bostrom (2014). “Superintelligence: Paths, Dangers, Strategies”, p.20, OUP Oxford