longtermist

From Wiktionary, the free dictionary

English

Etymology

From long-term + -ist.

Noun

longtermist (plural longtermists)

  1. (ethics, philosophy) A believer in or follower of longtermism (an ethical stance which gives priority to improving the long-term future).
    Coordinate term: neartermist
    • 2022 September 9, Theodore Leinwand, “Neartermism and longtermism aren’t at odds”, in The Washington Post:
      Longtermists are in daily conversation with neartermists. It’s a red herring to argue that “abandoning what would most help people on Earth today isn’t exactly ethically sound.”
    • 2022 November 17, Annie Lowrey, “Effective Altruism Committed the Sin It Was Supposed to Correct”, in The Atlantic:
      Some longtermists, for instance, argue that we need to balance the need to address climate change now with the need to invest in colonizing space; they encourage us to think on a billion-year timescale.
    • 2023 July 22, Andrew Anthony, “The pro-extinctionist philosopher who has sparked a battle over humanity’s future”, in The Observer:
      It’s an article of faith among longtermists that an event that led to the loss of 99% of humanity would be vastly preferable to one that kills off 100%.

Adjective

longtermist (not comparable)

  1. (ethics, philosophy) Of, pertaining to, or supporting longtermism (an ethical stance which gives priority to improving the long-term future).
    Coordinate term: neartermist
    • 2022 December 9, Jennifer Szalai, “Effective Altruism Warned of Risks. Did It Also Incentivize Them?”, in The New York Times:
      Effective altruists talk about both “neartermism” and “longtermism.” Bankman-Fried said he wanted his money to address longtermist threats like the dangers posed by artificial intelligence spiraling out of control.
    • 2023 July 22, Andrew Anthony, “The pro-extinctionist philosopher who has sparked a battle over humanity’s future”, in The Observer:
      Earlier this year, the AI theorist Eliezer Yudkowsky, who is associated with longtermist thinking, wrote an op-ed for Time magazine in which he argued that the world should not just institute a moratorium on artificial intelligence development but also be prepared to use nuclear arms to shut down large rogue computer farms that flouted the moratorium.
