Michael Bitton is a graduate student in Media Production in Toronto. For his thesis paper, he’s been researching effective uses of media for doing good in the world. He currently sees the most potential in health communication and social marketing in the developing world. You can find out about his writings at his blog, A Nice Place to Live. I had this conversation with him on 26 Feb via .impact, but thought it would be good to cross-post the conversation here. I’ve added a bit more from the comments I’ve received from others.
Note: These notes are a quick summary of our conversation and may not be all that coherent. They miss a lot of nuance and may not reflect statements that both people agree to.
What is the goal of marketing EA?
Move more existing donations to more effective charities
Create more donations
Get more people to think of “doing good” as a consideration for choosing their career
Create more people who will identify as effective altruists?
Create more vegetarians / vegans?
Who could the target audience be?
Right now, it seems like people either get EA immediately or not, as seen in currently unpublished GWWC marketing surveys. Though, as Ben Kuhn points out, this could be confabulation. He wouldn’t be surprised if people edited their memories so that they “got it” immediately. Ben used to think that he “got it immediately”, but now says in retrospect he doesn’t think that’s true. I personally think I got it immediately, even in retrospect.
Pablo Stafforini says that this could be tested by considering the lag between exposure to EA material and affiliation to an EA org, as it’s much harder to confabulate dates than about thought processes. Anecdotally, Pablo says he knows of many people who decided to contact GWWC and even sign the pledge after a few days of reading The Life You Can Save.
Michael Bitton thinks that equating “EA” with “signing GWWC’s pledge” is setting the bar way too high and that the bulk of the untapped population would only be persuaded to give “part-way.”
Fine-Tuning the Message
If it’s true that there is a core group of people who will get it, reaching them is all that matters (the “spray and pray” method), though we should check to make sure this isn’t misguided. It’s also possible that if we fine-tune our message, we can reach more people.
GWWC is a really big ask, and people might be receptive to smaller asks, like The Life You Can Save – comparing asking people to pledge at different rates would be informative.
The core EAs in Switzerland think that getting interested in science/rationality first may be more effective than pitching altruism. This works better for them and is their new strategy. Adriano Mannino knows more about this.
Potential Target Group: Secularist / Atheist / Skeptic community
Interested in secular morality
Friendly to consequentialism
Friendly to applying skepticism to charity
Friendly to thinking rationally / hearing arguments
Niel Bowerman thinks this would be a great group to tap into more heavily. His intuition is that many of them would “get” effectiveness and convincing them of the altruism component wouldn’t be all that difficult. He would be interested in hearing from anyone who has tried approaching these communities in more detail.
Potential Target Group: Devoutly religious
Pre-existing drive toward altruism
Willing to tithe their income / donate lots
May not identify with EA directly, but maybe a church could be persuaded to donate to Against Malaria Foundation or Give Directly, though it may be hard to persuade them away from religious / local causes.
Downside: the religious tend to be unfriendly to consequentialism
Niel knows a few very dedicated EAs pursuing earning-to-give who came through this route. As with other groups already heavily bought into the idea of altruism, such as campaigners, his experience is that a much lower fraction are interested, but the very few who “get” EA can become very dedicated.
What is our pitch?
Emphasize an “excited altruism” approach. While lots of current EAs found the moral-imperative approach compelling, the intuition is that a message of “self-sacrifice” might not be catchy for the mainstream. But it would be good to test further. Michael Bitton thinks people are driven mostly by low motives and thus their decisions will come down to factors besides merely identifying the morally right action. Plenty of people think vegetarianism is the morally superior option and yet eat meat anyway. Furthermore, the existence of current moral-imperative EAs isn’t much evidence that that pitch would persuade more people: it could be that the moral-imperative people would naturally be the first ones to join, for some reason.
Gather more stories of people who are EAs. Emphasizing a human element might make it more appealing.
Concern that EA as a community might backfire. Bitton says that people don’t like joining communities that have a value system. Furthermore, other things in the EA community could be a turn-off to some people. While the connection to utilitarianism is okay, things like cryonics, transhumanism, insect suffering, AGI, eugenics, whole brain emulation, suffering subroutines, the cost-effectiveness of having kids, polyamory, intelligence-enhancing drugs, the ethics of terraforming, bioterrorism, nanotechnology, synthetic biology, mindhacking, etc. might not appeal broadly. There’s a chance that people might accept the more mainstream global poverty angle but be turned off by these other aspects of EA. Bitton is unsure whether this should be taken as a reason for de-emphasizing those other aspects of the movement. Obviously, we want to attract more people, but also people who are more EA. He doesn’t have a good sense of how to approach decisions that involve trade-offs between these two desiderata.
Consider the fanbase analogy. As a start, Bitton thinks the above concern should be treated as analogous to a corporation deciding whether to appease its core fan base or the (more numerous) casual fans. I’m more interested in gaining “casual fans” (ordinary people that start to give a bit more or decide a bit better) than in further uniting the “core fan base” (the people that currently post in EA Facebook groups, on LessWrong, identify as EAs, read EA blogs, work on EA projects, etc.).
Concern that the EA community is too homogeneous. Even if EA is not a value system or ideology, it’s still a somewhat clear identity category (white, atheist, background in STEM, rationalist, “nerdy,” introverted, familiar with philosophy, dislike of continental philosophy, “innocent” in the sense of low rates of tattoos, piercings, smoking, heavy drinking, radical physical appearance, etc.).
Need more on-the-ground research to figure out how normal people think of charity (e.g., people donate to cancer charities because their parents died of cancer). Perhaps it would be immediately obvious to them that saving lives is more important than the arts, but they haven’t realized the two are in tension.
Need to learn more about which pitches work through A/B testing, focus groups, conversations with marketers, etc., to create actionable findings that are generally useful for the whole movement.