Any conclusion of the form "pursuing cause A also just so happens to be the thing that best optimises for cause B" should give us pause: the probability that our reasoning is biased is very high.
Indeed, it's likely that the optimal way to affect the far future generally lies in the space of suffering-focused AI safety research and/or direct outreach for this cause (cf. the research done by the Foundational Research Institute (FRI): www.foundational-research.org). However, if you have a comparative advantage in broader movement building, a decent case can be made that focusing on animal (meta-)activism might be optimal: spreading concern for the suffering of voiceless non-human minds (including artificial ones) plausibly affects our civilisation's long-run trajectory. Moreover, building a sufficiently cause-neutral animal movement has sizeable spill-over effects into direct far-future causes. The majority of current FRI researchers have animal rights backgrounds and considered the animal cause the main contender for top priority before looking into far-future strategy and AI safety.
While it does seem quite crazy to believe that more down-to-earth EA movement building, and animal movement building in particular, miraculously optimises for far-future impact too, the view that we can update a sufficient number of people towards weird far-future causes directly seems just as crazy. Many thoughtful people who tend to prioritise far-future concerns these days were very averse to far-future issues when they first heard about them.
Last but not least, moral pluralism, cooperation and compromise might recommend animal movement building (cause-neutral-ish and with an eye on far-future concerns: www.sentience-politics.org/philosophy): it's excellent for reducing suffering in the short term and comes with great spill-over into far-future causes. Being more cause-neutral and far-future-concerned is the main thing that sets Sentience Politics apart from other animal EA orgs – and it's an extremely crucial thing in my view.