Suffering risks

From Wikipedia, the free encyclopedia

Scope–severity grid from Bostrom's paper "Existential Risk Prevention as Global Priority"[1]

Suffering risks, or s-risks, are risks involving an astronomical amount of suffering, far exceeding all the suffering that has occurred on Earth thus far.[2][3] They are sometimes categorized as a subclass of existential risks.[4]

Sources of possible s-risks include embodied artificial intelligence[5] and superintelligence,[6] as well as space colonization. Space colonization could potentially lead to "constant and catastrophic wars"[7] and to an immense increase in wild animal suffering, since it could introduce wild animals, who "generally lead short, miserable lives full of sometimes the most brutal suffering", to other planets, either intentionally or inadvertently.[8]

Steven Umbrello, an AI ethics researcher, has warned that biological computing may make system design more prone to s-risks.[5]

References

  1. ^ Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority" (PDF). Global Policy. 4 (1): 15–31. doi:10.1111/1758-5899.12002. Archived (PDF) from the original on 2014-07-14. Retrieved 2024-02-12.
  2. ^ Daniel, Max (2017-06-20). "S-risks: Why they are the worst existential risks, and how to prevent them (EAG Boston 2017)". Center on Long-Term Risk. Archived from the original on 2023-10-08. Retrieved 2023-09-14.
  3. ^ Hilton, Benjamin (September 2022). "'S-risks'". 80,000 Hours. Archived from the original on 2024-05-09. Retrieved 2023-09-14.
  4. ^ Baumann, Tobias (2017). "S-risk FAQ". Center for Reducing Suffering. Archived from the original on 2023-07-09. Retrieved 2023-09-14.
  5. ^ a b Umbrello, Steven; Sorgner, Stefan Lorenz (June 2019). "Nonconscious Cognitive Suffering: Considering Suffering Risks of Embodied Artificial Intelligence". Philosophies. 4 (2): 24. doi:10.3390/philosophies4020024. hdl:2318/1702133.
  6. ^ Sotala, Kaj; Gloor, Lukas (2017-12-27). "Superintelligence As a Cause or Cure For Risks of Astronomical Suffering". Informatica. 41 (4). ISSN 1854-3871. Archived from the original on 2021-04-16. Retrieved 2021-02-10.
  7. ^ Torres, Phil (2018-06-01). "Space colonization and suffering risks: Reassessing the "maxipok rule"". Futures. 100: 74–85. doi:10.1016/j.futures.2018.04.008. ISSN 0016-3287. S2CID 149794325. Archived from the original on 2019-04-29. Retrieved 2021-02-10.
  8. ^ Kovic, Marko (2021-02-01). "Risks of space colonization". Futures. 126: 102638. doi:10.1016/j.futures.2020.102638. ISSN 0016-3287. S2CID 230597480.

This page was last edited on 9 May 2024, at 19:10
Text is available under the CC BY-SA 3.0 Unported License.