
English


Etymology

From long-term +‎ -ism.

Noun

longtermism (uncountable)

  1. (ethics, philosophy) An ethical theory, generally associated with effective altruism, which prioritizes improving the conditions of the long-term or distant future, rather than focusing exclusively on the near term (as in neartermism).
    Antonym: neartermism
    • 2017, Benjamin Todd, “Longtermism: the moral significance of future generations”, in 80,000 Hours[1], retrieved 2022-04-13:
      When this thesis [termed earlier the 'long-term value thesis'] is also combined with the idea that some of our actions can have non-negligible effects on how the future goes, it implies that one of our biggest priorities should be ensuring the future goes well. This further idea is usually called ‘longtermism.’ […] The arguments for and against longtermism are a fascinating new area of research. Many of the key advances have been made by philosophers who have spent time in Oxford, like Derek Parfit, Nick Bostrom, Nick Beckstead, Hilary Greaves and Toby Ord.
    • 2022 April 13, Erik Hoel, “How to prevent the coming inhuman future: On stopping the worst excesses of AI, genetic engineering, and brain-tampering”, in The Intrinsic Perspective[2], retrieved 2022-04-13:
      There are a handful of obvious goals we should have for humanity's longterm future, but the most ignored is simply making sure that humanity remains human. It's what the average person on the street would care about, for sure. And yet it is missed by many of those working on longtermism, who are often effective altruists or rationalists or futurists (or some other label nearby to these), and who instead usually focus on ensuring economic progress, avoiding existential risk, and accelerating technologies like biotechnology and artificial intelligence—ironically, the very technologies that may make us unrecognizably inhuman and bring about our reckoning. […] So the future will be upon us faster than we think, and we need to start making decisions about what sort of future we want now. Longtermism gives a moral framework for doing this: it is the view that we should give future lives moral weight, perhaps at a discount, perhaps not; for whether one does or doesn’t discount hypothetical future people (vs. real people) turns out to be rather irrelevant. There are just so many more potential people than actual people, or even historical people who've ever lived.
    • 2022 December 9, Jennifer Szalai, “Effective Altruism Warned of Risks. Did It Also Incentivize Them?”, in The New York Times[3]:
      Effective altruists talk about both “neartermism” and “longtermism.” […] MacAskill repeatedly calls longtermism “common sense” and “intuitive.” But each of those terse sentences glosses over a host of additional questions, and it takes MacAskill an entire book to address them.
  2. (business, management) Synonym of long-termism (concentration on long-term goals rather than short-term security or advantage).

