Content from my Philanthropy 2173 Blogspot feed


The future of good

FastCompany Magazine "Best Blog"
Huffington Post "Philanthropy Game Changer"

Photo by Richard Stovall on Unsplash

Democracy in the USA is not "naturally" withering; it is under attack. And the calls are coming from both inside and outside the house, domestic and foreign. One source of attack is the Republican Party. Threats can't be beaten if they aren't named. I strongly suggest that foundations, their associations, and their media stop "both-sidesing" this and call out the threats to the sector that are coming from their own.

First and foremost, Donald Trump's campaign has declared it will be "taxing, fining, and suing excessively large private university endowments" to fund its own propaganda-driven alternative university. Now, big private universities don't usually inspire a lot of sympathy, I get that. I'm an alum of them and they don't make me all warm and fuzzy. But be clear: none of this has anything to do with antisemitism (which gets a quick shout-out in the document linked above). It's part of a sustained campaign against perceived liberal or left(ish) civil society. The presumed candidate of the Republican Party is promising/threatening to seize endowment assets from universities it doesn't like. I'll say it again: the GOP is running on a platform that involves taking funds away from nonprofits it doesn't like. If that doesn't make the philanthropy industry stand up and take notice (and, one might hope, action), I can't think of a bigger threat that the sector would be ignoring. And this from a candidate who's been repeatedly sued for the way he ran his nominal foundation.

All nonprofits and foundations, their professional and lobbying associations, and the media dedicated to them should decry a platform such as the one proposed in Agenda47. And, what's that I hear? Yup, crickets.

Or worse, Inside Philanthropy worked hard on this rundown of funding for democracy (it's behind their paywall; yell at them, not me). It's good reporting on a survey done by the Democracy Fund that focuses on giving to democracy efforts and related causes. But it counts funding on just one side of the equation: funding by funders in the political center or on the left. It doesn't count the other side - there is no accounting of efforts to undermine democracy. The story mentions book bans, school board fights, and transgender bathroom hysteria as examples of undemocratic philanthropy. But it neither tallies the amount of philanthropic dollars spent on these issues nor names any of the funders. That's not helpful. Those are philanthropic dollars going to efforts that undermine democracy - and they're by no means the only ways such money is being spent. (Supreme Court favors, anyone? Social media trolls, disinformation, and campaigns such as the one run by Christopher Rufo, with help from Congresswoman Stefanik, to oust female college presidents of color? The list is long.)

Attacks on democracy are secretly well-funded even as they appear to be led by grassroots individuals. Counting the funding on the pro-side and not on the attack-side makes it seem as if the attacks are just part of the process of democracy. And that may be true. But if it's true, it's true in the sense that democracy will always have critics, and some of those critics will be doing their best to destroy democratic participation by those they don't like.

One of the two political parties in this country is running on a platform that includes seizing endowment assets. Yes, the campaign platform of the GOP is "vote for us and we'll put government in charge of higher education and destroy some of the nation's longest-lived independent institutions." (For all the vitriol these universities attract, there are a helluva lot of rich people trying hard to get their kids admitted to them.) You may not feel sorry for Harvard, but you'd be a fool for thinking this is just an attack on the Crimson. That's what the GOP wants you to think, but it's not (all) they want to do.

If foundations, philanthropy, and nonprofits don't stand up to defend civil society from Agenda47 before November, they'll deserve what happens, post-election.

Author: Lucy Bernholz
Posted: February 14, 2024, 12:22 am

Photo by Enrique Macias on Unsplash

Open source technology has a long history of being a counterbalancing force to closed, proprietary systems. For decades it was open source versus corporations. Then Microsoft (closed, proprietary) bought GitHub (the most-used repository of open source code). Today, in the AI battles, Facebook/Meta, IBM, and Oracle, along with universities and the National Science Foundation, announced the AI Alliance, dedicated to open AI models. This is part of the larger debate about building responsible/trustworthy/safe/ethical AI.

So some of the world's biggest tech companies, many of which have thrived on proprietary, patented, trademarked, and closed-source code, are now arguing that an open community of developers is the way forward to protect us from the harms of AI.

This is one more step in both the commercial battles for market dominance and the battles over the definitions of words such as safety, ethical, trustworthy, and responsible (in the context of AI). For example, effective altruists and longterm(ers/ists) focus on the word "safety." Their bogeyman is the potential for AI to destroy humanity. This group, the AI Alliance, uses the terms "open" and "responsible." Their bogeyman appears to be the other companies who've already launched proprietary models - like Google and Microsoft.

The mix of organizations and funding in these AI debates includes corporations, governments, and numerous nonprofits - not only universities, but also groups of developers and advocacy organizations. Philanthropic funding is very much in the mix. The direction of AI development is not simply an external force acting upon the nonprofit/philanthropic sector; it is being shaped by numerous actors within the sector. The meaning and purpose of "open" in this context is neither static, nor simple.

Author: Lucy Bernholz
Posted: December 6, 2023, 7:46 pm

M.C. Escher, Relativity Stairs

Imagine a large - no, bigger, much bigger - nonprofit hospital, university, housing developer, or after-school program. Bigger by assets than any other. Right now, there are 13 universities in the U.S.A. with endowments of more than $10 billion (one of which is a "public" university), with the largest topping $50 billion. Bigger than that.

There is one. OpenAI. Though its size is based not on endowed assets but rather on speculative stock value, the organization, which is still as of this writing a nonprofit, is valued at $86 billion. It's not clear that the organization will continue with its current structure - the events of the last few weeks resulted in a new board and promises to revisit the structure.

Others have written about what the weeks' events mean for the development of AI going forward, the effective altruism (paywall) movement, tech bros, and capitalism. I want to think about what it means - if anything - for civil society. 

First, it seems that no one in civil society or the U.S. nonprofit sector really sees the organization as anything other than a commercial firm. (It has a capped-profit structure, which limits the amount of profit returned to shareholders, but it designates profits to be reinvested in the organization, as nonprofits do, only after investors are paid out.)

I can understand this view, sort of. The sector in the U.S. (as represented by its lobbying/advocacy/infrastructure groups) is still hung up on a certain kind of charitable corporation, designated as 501(c)(3) (OpenAI is such), and doesn't pay much attention to the dozens of other structures that identify as nonprofits. Heck, it's hard to get these groups to address the problematically porous nature of c3s and c4s; they're way behind the eight ball in understanding that they swim in a sea filled with informal associations, "Slack"-based "organizations" for mutual aid or volunteering, B corporations, and hybrids. So perhaps it's way too much of an ask to expect recognition among their own of this behemoth of technology development.

Second, the OpenAI events show that the nonprofit governance model is not "strong" enough to outweigh the interests of investors. Given the model's purpose in this situation, and the information that's public, the nonprofit board that fired the CEO was acting as it was intended. I guess no one thought they'd actually do what they were set up to do. 

Third, while the argument for data trusts has largely focused on the difference between digital assets and analog ones as the reason for a new organizational form, they're still rare and probably outnumbered by hybrids of profit/nonprofit forms. The AI world - especially that which professes some commitment to "ethics," "safety," "responsibility," or "trustworthiness"* - is rife with hybrids, not trusts. But they're not limited to this field - they're plentiful in journalism, for example. I highlight this in the forthcoming Blueprint 24.

Fourth, it's not just the structure of the organization that matters, it's also the structure of the funding. Many supporters of the AI organizations we captured for our dataset (live link on December 15, 2023) are contributing via deductible donations and commercial investments. The more the donor class uses LLCs and family offices, the harder it is to determine what kind of funding they're putting where. While those who invested for a financial return in OpenAI may be happy with the result of the last few weeks, what about those who donated with an eye on the mission? 

Fifth, philanthropy is playing a not insignificant role in these developments. Individuals and organizations associated with effective altruism fund at least 10% of the 160+ AI organizations we track in Blueprint24. Their funding for AI policy fellowships and internships is particularly notable, as these individuals are now well-represented inside policy-making bodies. In a very short time, philanthropy has had a significant impact on the development of a major industry, its regulatory overseers (at least in the U.S.A.), and the public discourse surrounding it. Had this happened in education, healthcare, or other domains where philanthropy is active, we'd see the industry press and professional associations paying close attention (and claiming all kinds of credit). Yet, as noted in the intro, voices in civil society and philanthropy have been awfully quiet about this "impact" on AI.

As someone who has been tracking and explicating the changing nature of organizations in civil society, I see OpenAI as a huge, well-publicized example of something that's been going on for a while. The nonprofit sector ain't what you think it is. And its codified boundaries - the legalities that distinguish nonprofit corporations from commercial ones - may not be up to the task of prioritizing mission over financial returns when the assets are digital, the potential for profit is so hyped, and the domain (AI development) is easy for insiders to make seem arcane and "too hard for you to understand."

*These are some of the phrases being used in the debates over AI development. It's critical to keep an eye on these terms - they don't all mean the same thing, they are used interchangeably though they shouldn't be, and some of them are being used to deliberately gaslight the public about our options when it comes to developing these technologies. Just as political propagandists excel at hijacking terms to denude them of power (see, for example, "fake news"), so, too, do commercial marketers and ideologues excel at using phrases like "safety" to seem universally meaningful, thus providing cover for all kinds of definitions. See Timnit Gebru and Émile Torres on TESCREAL.

Author: Lucy Bernholz
Posted: November 30, 2023, 7:35 pm

Photo by Brett Jordan on Unsplash

Has a philanthropic strategy ever before become an identity? I'm confident that neither John D. Rockefeller nor Andrew Carnegie ever referred to himself as a scientific philanthropist - a name historians have applied to them. I've heard organizations tout their work as trust-based philanthropy, but I have yet to hear anyone refer to themselves that way. Same with strategic philanthropy. And even if you can find one or two people who call themselves "strategic" or "trust-based" philanthropists, I'm confident you can't find me thousands.

Effective altruism, on the other hand, is all three - ideology, identity, and philanthropic approach. 

Given the behavior of Sam Bankman-Fried and his pals at FTX, it's also a failed cover for fraud. But I digress. 

In the upcoming Blueprint24 (due out on December 15 - it will be free and downloadable here), I look at the role of Effective Altruism in the burgeoning universe of AI organizations. I went in with two hypotheses.

H1: There are 100s of new organizations focused on "trustworthy" or "safe" AI, but behind them is a small group of people with strong connections among them.

H2: These organizations over-represent "hybrids" - organizations with many different forms and names, connected via a common group of founders/funders/employees - for some reason.

The Blueprint provides my findings on H1 and H2 (yes, but bigger than I thought, and yes, and I give three possible reasons) and will also make public the database of organizations, founders, and funders that a student built for me. So the weekend drama over at OpenAI certainly caught my attention.

By now, you've probably read about some of the drama at OpenAI. As you follow that story, keep in mind that at least two of the four board members who voted to oust the CEO are self-identified effective altruists, as is the guy who was just named interim CEO. These are board members of the 501(c)(3) nonprofit OpenAI, Inc.

Effective Altruism's interests in AI run toward the potential for existential risk - the concern that AI will destroy humanity in some way. Effective altruists also bring a decidedly utilitarian philosophy to their work - to the point of having calculated things like the value of a "life year" and a "disability-affected life year," and using these calculations to inform their giving.*

The focus on existential threats leads to a few things in the real world, in real time. First, it distracts from actual harms being done to real people right now. Second, the spectre of faraway harms isn't as motivating to action as it should be - see humanity's track record on climate change, pandemic prevention, inequality, etc. Pointing to the faraway future is a sure way to weaken attention from regulators and ensure that the public doesn't prioritize protecting itself. Third, faraway predictions require being able to argue how we get from now to then - which bakes in a bunch of steps and processes (often called path dependencies). Those path dependencies then ensure that what's being done today comes to seem like the only thing we could possibly be doing.

Think of it like this: I tell you we're going to get together on Thursday to give thanks and celebrate community. From this, we decide, OK, we need to buy the turkey now. Once we have a turkey, we're going to have to cook it. Then we're going to have to eat it. Come Thursday, we will have turkey, regardless of anything else. We've set our direction and there's only one path to Thursday.

But what if, instead, I tell you we want to get together on Thursday to celebrate community and give thanks, but we also want to make sure that everyone we will invite has enough to eat from now until Thursday? We'd probably not buy a turkey at all. Instead, we'd spend our time checking in on each other's well-being and pantry situation, and if we found people without food we'd find them some. We can still get together on Thursday, comfortable in knowing that everyone has had their daily needs met. In other words, if we focus on the things going wrong now we can fix those, without setting ourselves down a path of no return. And we still get to enjoy ourselves and give thanks on Thursday.**

The focus on long-term harms allows the very people who are building the systems to keep building them. They then cast themselves as "heroic" for raising concerns while they simultaneously shape (and benefit from) the things they're doing now. Once their tools are embedded in our lives, we will be headed toward the future they portend, and it will be much harder to rid ourselves of the tools. The moment of greatest choice is now, before we head much further down any paths.

It's important to interrogate the values and aspirations of those who are designing AI systems, like the leadership of OpenAI. Not at a surface level, but more deeply. Dr. Timnit Gebru helps us do this through her work at DAIR, but also by doing some of the heavy lifting on what these folks believe. She provides us with an acronym, TESCREAL, to explain what she's found. TESCREAL (the bonus buzzword I promised) stands for "Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, and Longtermism." Listen here to hear Dr. Gebru and Émile Torres discuss where these terms come from. And don't skip over the part about race and eugenics.

Effective Altruism is much more than a way to think about giving away one's money. It's an ideology that has become an identity. A self-professed identity. That reveals a power, an attraction in the approach that is unmatched, as far as I can tell, in the history of modern philanthropy. At the moment, this identity and ideology also seems to have a role in the development of AI that is far greater than many have realized. It's critical that we understand what they believe and what they're building.


*As someone with a newly acquired disability, I'd be curious about their estimation of the difference between a "life year" and a "disability-affected life year" if I weren't already so repulsed by the idea of putting a value on either.

**Agreed, not the best metaphor. But maybe it works, a little bit?

Author: Lucy Bernholz
Posted: November 20, 2023, 9:38 pm

Photo by Ruy Reis on Unsplash

A headline in today's Chronicle of Philanthropy reads:

"Philanthropy’s Job in Polarized America: Make Partners of Enemies, a New Poll Says"

Which raises an obvious question, 

"Why do you think philanthropy is the solution and not part of the problem?"

We often talk about civil society and philanthropy as if they only do good. And then we go on to debate the meaning of good. While that can be hard, we're often pretty clear we know what it isn't when we see it. 

So when I see headlines about Project 2025 - a coordinated effort by more than 80 nonprofit organizations (both c3s and c4s) to put loyalists to Donald Trump in positions up and down government and across state and federal jurisdictions - I don't just doubt the willingness of these groups to "make partners of enemies." I doubt the willingness or ability of groups on the democratic side of the ledger to do so either. I also doubt the willingness of most media outlets, almost all of which seem to have become aligned with one political side or the other. 

I've written a lot over the years about the blurring of the lines between charity and politics. This is most clear in the way funding now works - flowing between c3s and c4s, coming out of donors' LLCs and DAFs. The money moves in ways that remove donors' names from donations, and it goes in and out of organizations between reporting dates, which often come long after the money has been used. As I first wrote following the Citizens United decision in 2010, the scale and appeal of political money would be too much for charitable nonprofits to ignore. In taking such money, and even perhaps in trying to ignore such funds, nonprofit activities are increasingly aligned with one political side or the other.

We need better mechanisms for tracking money through nonprofits and into political activities. We need to be able to follow dollars into politics, no matter what kind of organization they flow through. We need to be able to track and report this funding in more useful time frames than oft-delayed tax filings. And we need to be more honest with ourselves and in our writings about civil society and philanthropy. Which requires acknowledging that some (measurable, but not yet measured) percentage of both funders and nonprofits are deliberately pursuing political ends while masquerading as nonpolitical entities. Only when we acknowledge this reality can we begin the process of writing new rules for reporting, transparency, legitimate activities, and meaningful accountability. Which, of course, helps explain why the sectors themselves aren't necessarily interested in acknowledging this reality.

Philanthropy and nonprofits are small p political. Your theory of change, the problems you choose to address, and the ways you seek to solve them reveal political assumptions and allegiances. This has long been true. Now, as the ideologies and paths to change proposed by the country's two political parties grow ever further apart from each other, these associations become more obvious, more visible. Add to this the constant growth in political giving, and it seems that civil society is growing increasingly capital P political, and that at least some of that is due to the preferences of funders. It's hard for me to see how any of this positions civil society or philanthropy as the recourse to social and political polarization.

There are things that we can do to bridge our differences. But we should first recognize just how broadly our political differences influence things like where we live, work, shop, read, worship, play, travel, and donate our time and money. And not assume that every philanthropic or nonprofit organization is interested in or equipped to help with that bridging. It seems that some portion of them are quite invested in exactly the opposite.

Author: Lucy Bernholz
Posted: November 14, 2023, 12:54 am


Photo by JJ Ying on Unsplash

Ah, AI. Can't avoid it. 

I've been to the conferences and workshops, read the listservs, talked to the researchers and read some of the research, played with the public tools. The Blueprint 2024 lays out my thoughts on nonprofits, philanthropy and AI for 2024. 

This coming Blueprint (available live and free on 12/15/23) skips the prediction section - and explains why. But I have some thoughts on how AI is going to unfold in the sector, especially after checking out this new resource from Giving Tuesday - the AI Generosity Working Group.

Year 0-1:  November 1, 2022 - 2023 - hype, fear, webinars, and conference talks. Lots of press. Lots of handwaving. Gadgets. Lots of executive orders and unfunded government mandates and policy proposals pushed by tech companies.

Year 2 - 2024: More hype, lots of feel-good examples (the Red Cross is using it! AI for disasters!) and a few scandals (lawsuits over data use, data loss, etc.) will fill the news. Lots of nonprofits will try things, realize they don't have the expertise on staff and are diverting resources from mission, and will go back to ignoring the topic. By this time, we'll all be using AI all the time, however, as AI capacities will be fully baked into every software product you already have - every Microsoft product, Canva, Zoom, Salesforce. We're already there, actually.

Years 3 - 5: Certain domains will achieve breakthroughs with AI. These are most likely to be medical research, tech development itself, and environmental analysis (including analysis of the damage AI does to the environment in terms of water usage and power consumption). Advocacy organizations working on human issues from migration to healthcare, education to food benefits, will be up to their eyeballs in litigation and integrated advocacy efforts with digital and civil rights groups over harms caused by AI. My hopeful self says nonprofits and foundations will get fully on board with data governance needs (finally) as litigation, regulation, or insurance premiums require them to manage their data better. AI - as the scary bogeyman/breakthrough opportunity - will help organizations finally understand what data governance is about.

Years 3 - 5: AI nonprofits and philanthropy will be "things." Product launches of AI-driven giving advisors, AI-driven advocacy campaigns, AI+Blockchain smart contract organizations in the social sector. Most, if not all, will be hype and bust. 

Year 4 +: AI will be so thoroughly baked into every commercial product on which the social sector and philanthropy depend that we'll no longer talk about it much. It would be like discussing cell phones - everyone will have it somewhere in their organizations, new expectations will emerge because of its prevalence, and we'll not be talking about it as much.

As individual organizations become dependent on AI-powered software tools, we'll reach the next level of concern - the existing regulatory regime for nonprofits and foundations will be leaking and breaking, and proposals for new structures and laws will be circulating. The sector's policy advocates will bemoan their missed opportunities, back in 2023 and 2024, to influence the regulations on AI itself. The blend of nonprofits and commercial activity and/or nonprofits and political activity, will complicate such new debates. By this time, the academy and independent research groups like AJL or DAIR will have repeatedly documented harms caused by AI and have proposed numerous remedies.

Having been ignored by industry for 4+ years, we'll see new attention to these ideas. We'll also see a burst of former AI company employees "whistleblowing" or "following their consciences," leaving industry and setting forth to solve the problems they helped create while on the inside. By the time this happens, everyone will be used to and dependent on their AI-enabled tech, and even those who are eager to stop using it will find it "too difficult" to change their tech.

Some of the above is tongue in cheek. But, like the Gartner hype cycle, this loose set of predictions is based on the experience of other breakthrough technologies. It's probably too linear - and doesn't take into account the innumerable "wild card" events that are likely to occur between now and 2028. In other words, by 2028 we'll be having the debates about AI that we had about social media and the 2016 election. Some of these we're already having - especially with regard to elections - and that's a good thing. But it's not going to stop, or even redirect, this flow of events.

It doesn't need to unfold this way at all. Sadly, I don't see enough activity - organizations, advocacy, pushback, regulatory oversight - out there to prevent this (all too familiar) pattern from playing out. And certainly not compared to the dollars being spent right now by corporate marketing departments to hook nonprofits.


Author: Lucy Bernholz
Posted: November 9, 2023, 9:14 pm

Photo by Uriel Soberanes on Unsplash

I had a chance to speak on a zoom panel today. In the before times, this would not be worthy of a comment - it's what I did. Dozens in a year. However, since getting covid that became long covid, I haven't been able to do...much. Between managing the illness, doctors and tests, and staying employed I'm at my max. 

It was fun to chat with folks. The event was hosted by Stanford Alumni Association, Stanford Alumni in Public Service, the Latino Alumni Association, and a few others - there is a recording but I don't know if it will be available beyond the hosts' networks. 

During the course of the hour, I got asked about AI and philanthropy. The Blueprint 2024 (coming on December 15, 2023) has much to say about this. But during the panel I realized something I've thought about for years, but don't think I've said before. 

Here are two true things:

1. In the past 20 years there has been a lot of innovation in digital tech and a lot of tech has been applied to giving - crowdfunding, text giving, online donations, information sources, giving platforms, etc. 

2. Participation rates in charitable giving in the U.S.A. over the last 20 years have gone down. (Total giving keeps going up, but that's from more rich people making more big dollar gifts.)

More tech. Fewer givers.* No one invested in giving innovation wants to hear that. Because, sadly, innovation in giving has become synonymous with "throw some tech at it." If "innovation" is supposed to lead to "more," then it's not working.


Another thought - one I also share in more detail in the Blueprint 2024

Lots of people are focused on AI. There are vendors and others in the social sector who are eager to sell you some AI-powered gizmo to improve your fundraising (this is the biggest market at the moment). Here's the catch:

  1. AI gets trained on data
  2. The best data we have, on financial gifts to nonprofits, is wildly incomplete, misses out on many kinds of giving, is culturally misaligned for many givers, misses everything having to do with political giving, and ignores a great deal of the giving that other tech makes possible - for example, most crowdfunding or direct gifts to individuals 
  3. So today's AIs are being trained on yesterday's bad data
  4. This is a good thing, how?  How will it help you in the future?
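To make the point in steps 1-3 concrete, here's a minimal sketch (hypothetical data, my own illustration - not any vendor's actual product): a toy "giving predictor" trained only on the gift types that show up in formal records assigns zero weight to every channel the data never captured.

```python
# A toy model "trained" on yesterday's incomplete giving data.
# The data, gift types, and model are all hypothetical illustrations.
from collections import Counter

# "Yesterday's data": only formally reported gift types appear.
recorded_gifts = [
    "501c3 check", "501c3 check", "501c3 check",
    "donor-advised fund", "donor-advised fund",
]

model = Counter(recorded_gifts)  # frequency counts stand in for training

def predicted_share(gift_type: str) -> float:
    """Share of future giving the model expects for a channel."""
    total = sum(model.values())
    return model[gift_type] / total

print(predicted_share("501c3 check"))   # 0.6 - over-weighted
print(predicted_share("crowdfunding"))  # 0.0 - invisible to the model
print(predicted_share("mutual aid"))    # 0.0 - invisible to the model
```

However sophisticated the real systems are, the underlying dynamic is the same: channels missing from the training data - crowdfunding, direct gifts, political giving - simply don't exist as far as the model is concerned.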

More on this in the #Blueprint. But happy to talk about it before then. I don't hang out on the site formerly known as twitter anymore - it's a bit too much like late night at a KKK frat party for me. You can find me on LinkedIn, Bluesky and Mastodon - c'mon over, we'll chat.

*Yes, there are problems with the data. I know that you know that I know this. I write about it all the time. In the "truth" above I'm drawing from GivingUSA data and analyses thereof over time. So, the most basic data we have on giving - financial contributions to 501(c)(3) nonprofits. It's not the whole story by any means, but it's pretty comprehensive and accurate for what it is.

Author: Lucy Bernholz
Posted: October 26, 2023, 12:00 am

 I am so excited about this. Join us!

The speaker series is open to anyone who wishes to engage with critical insights on the intersections and implications of digital dependencies with democratic norms and civil society values and actors. It is structured as a hybrid experience, allowing you to choose whether you’d like to attend in person or join us virtually. Light refreshments and snacks will be provided.


Register here to join in-person or register here to join the event virtually. Click here to learn more about our upcoming events.
Author: Lucy Bernholz
Posted: September 15, 2023, 6:00 pm

Photo by Jon Moore on Unsplash

If you've been wondering why the rhetoric around AI sounds so familiar, I have some thoughts. 

If you read Nancy MacLean's 2017 bestseller, Democracy in Chains, and then pick up a newspaper (or open a news company's app) and read this story on funding for AI scholarship at elite universities across the country, you will notice that the funders/philanthropists in the news story are using the playbook developed by those in the historical study.

Democracy in Chains is about the fueling of libertarianism and a political economy that favors the wealthy few - an undemocratic project based on perverting majority-based systems to serve a very rich, very determined, self-interested few. It goes further than Jane Mayer's brilliant Dark Money to show the intellectual history and the broad reach of the nonprofit/think tank/university (in other words, nonprofit) infrastructure for turning ideology into public policy. MacLean's book centers on the Koch brothers - an updated version could factor in a wide range of philanthropic/funder/investor actions from younger billionaires and include otherwise-inexplicable actions such as Musk's purchase and destruction of Twitter, and the general weirdness (horror) of recent constitutional jurisprudence (SFFA v. Harvard and UNC). When we are searching to make sense of a present moment it is helpful - extremely so, in this case - to look to both short- and long-term historical precedents.

When it comes to our current moment (in the U.S.), in which Supreme Court decisions seem to abandon procedural and substantive norms from one day to the next, we're all rapidly trying to learn to distinguish AI-generated text/photos/videos from those made by humans, and everything from the weather to the role of elections in this democracy seems up for grabs, these historical events are helpful. It's not quite rhyming (as historians will remind us), but there are patterns to see that can be helpful. MacLean shows a 50+ year arc of an ideological project built around a minority viewpoint that has yielded extraordinary, stealthy success. It's worth understanding those past patterns to understand our current setting.

It's no coincidence that today's funders focused on existential risks of AI are using the playbook of scholarships, fellowships, and academic centers to build cadres of like-minded thinkers.  It focuses your attention downstream, away from the present. This funding model works - especially if you take a multi-decade time frame.

Just because it "works," however, doesn't mean it is in the best interest of anyone but those funding and being funded. The Kochs and their allies were very clear that their project benefitted a minority (wealth owners). What they needed to do was bend the systems of a majority-based democracy to serve minoritarian ends. This was not hard to do, since the U.S. Constitutional system has numerous minoritarian run-arounds (e.g., Senate apportionment, the electoral college, voting rules) built into it. We should be on the lookout for similar motivations and efforts as we think about our now AI-dominant online information sources, systems, and messaging.

Some of those engaged in discussions and training about existential AI risks will note that human extinction is likely to come faster from climate change, weaponized nuclear facilities, and perhaps the next pandemic than from man-hating robots. Focusing scholars' and the media's attention on the potential long-term harms to all of humanity is a slick way of distracting those same communities and others from the here-and-now harms of AI-enabled disinformation, discrimination, and economic harms for people already marginalized by race, religion, identity, and/or income. Each moment that goes by in which near-term harms are ignored is another chance for the current powers to further implant, strengthen, and reap the rewards of the very path dependencies that lead to the future they claim to be fighting against.

In short, beware the arguments of those who direct your attention to far-away catastrophes while they benefit by building those very systems now. Better to refuse, redirect, or rebuild systems that cause no harm now, for they will also cause less harm later.

Author: Lucy Bernholz
Posted: July 5, 2023, 6:29 pm


You know that the PGA Tour is a nonprofit, don't you?*

I'm also sure you've heard the news that the Saudi government (via its public investment fund, with $600+ billion in assets) launched a new tour (called LIV) which has announced a merger with the PGA Tour. Details are being worked out (and investigated.)

Why does this deal stink so much? Sportswashing by a country with a dismal human rights record is pretty obvious - especially as the country is unabashedly trying to buy soccer talent as well. Certainly, families of people who were killed on September 11, 2001 are disgusted (my own included). There's a lot of media on this story about the players, the fans, the public, the sportswashing, human rights, and, of course, Trump Sr., Kushner Jr., and Mnuchin. I'll let you read all that elsewhere.

Let's go back to the role of the Tour as a nonprofit organization. If you check (screenshot above) you'll find the PGA Tour with its $4 billion in assets as well as about a dozen other PGA-named nonprofits, including a 501(c)(3) foundation with $10,000 in assets and an organization for and by the wives of PGA players.

This comes along as the United States has lost control of our system for financing campaigns and the regulatory body in charge (the FEC) is hogtied by politics. Money flows from individuals and corporations to nonprofits, where the names of the donors are "washed off" and the money is passed through to politically-active affiliated organizations. Sometimes, people just "move" nonprofit funds to their own pockets. As I predicted in 2010, when the Citizens United decision was handed down, large swaths of nonprofit organizations have become money laundering mechanisms for politics. This structure - foreign government "investment" in a nonprofit that holds extravagant and expensive events at properties owned by an indicted former president running again for office - looks and smells like the making of a money washing scandal from here, before the deal is even done.

The new entity ("NewCo" to be born from PGA + LIV) will be a commercial enterprise. Owned by the nonprofit PGA. I'm not a lawyer but I can read these signs - that means no conversion foundation or tax payback from the nonprofit. Massive commercial investments plus a nonprofit structure that will enable anonymous financial flows. A set of nesting doll organizations ripe for funding abuse by anyone, anywhere interested in political influence, but particularly convenient for foreign governments. Given the timing, expect big concerns about funding and influence in the 2024 Presidential election.

Given the cast of characters involved, I'll say it out loud now: this deal looks like the biggest money laundering machine yet to be carved out of the nonprofit tax code. I'll put my bet down now - If the deal goes through, this will become a story of campaign finance violations. And we're watching it being put together right in front of us. It may never happen due to antitrust and other reasons, but still, it's important to see what this deal intends, and realize if not this, then somewhere else.

*I'm sure you remember that the NFL was a nonprofit until 2015 - when it reorganized as a commercial entity. That happened under section 501(c)(6) of the IRS Code.

Author: Lucy Bernholz
Posted: June 8, 2023, 7:53 pm

Photo from Possessed Photography on Unsplash

Dateline: May, 2027

Location: Pretty much anywhere on earth

“Miriam was one of those rare people who could remember reading about her cause of death before it happened. It wasn’t the reading that was rare - the warning had been printed in The New York Times, page A9. It also wasn’t the dying that was rare - hundreds of thousands of people would die of the same cause. It was the remembering that was rare.”

Yes, that’s fiction. I just made it up. Because I just read this story in today’s New York Times: record heat between now and 2027 due to climate catastrophe and El Niño weather patterns. It’s likely that one of the years between now and then will cross the mark of 1.5 degrees Celsius hotter than the 19th-century average.

So, there’s the science. The article goes on to do the work - “This will have far-reaching repercussions for health, food, water management and the environment.” 

Keep going - do the rest of the work: Those far-reaching repercussions mean fires, droughts, floods, food shortages, hunger, water wars (term used deliberately). These things mean death. I made up Miriam and I interpolated from the global recent past to get to “hundreds of thousands of deaths.” (We’ve passed the tens of thousands marker.) Here’s what’s happening now - four years after the devastating 2019 Australian summer.

If you have children starting elementary school this Fall, 2027 will be here before they go on to middle school. If your child was accepted to a four-year college this Spring, they’ve just been welcomed in to the class of 2027. If you’re writing a five-year (?) strategic plan for your foundation/nonprofit, you’re planning on precisely this timeline of disasters - how are you fitting them into those plans?

I wrote a wee bit of fiction from this news. (I’ve also done some actual prep, which, given the global nature of the prediction, is challenging.) How do we respond to predictions like this - action? Stasis? What are you doing? What can we do together?

Author: Lucy Bernholz
Posted: May 18, 2023, 8:48 pm
We're in an incredible moment. After decades of research and advocacy and warnings we are now living through the weather and natural disaster effects of climate collapse. We're also more than a few meters down the pitch of living with pervasive artificial intelligent systems. 

Ways of life from agriculture to writing, architecture to transportation are transitioning. The practices for adapting to more sustainable, more energy efficient, lower impact methodologies are being refined, shared, modeled and implemented at scale in some places. 

 And then there's this (which I reprinted with permission in the Blueprint 2022)

My question is: are there examples of philanthropy that are clearly rooted in a sense of transition from one state to another? There are funds named for transitions - or at least there is the Just Transitions Fund - but are there others? If there are, what defines them? What are they transitioning to? Where are the experiments, innovations, regulatory reconsiderations, imaginaries, and alternatives in philanthropy and civil society that make use of (but don't venerate) our current capacities (for almost instant global communication, for example) and that pursue a vision of human thriving on a climate-damaged planet? How would such philanthropy work, what would it look like, what would it do differently from now, and how would it change itself in order to justify its continued existence?

That last question is not meant to be rhetorical. The time frame for irreversible climate collapse is now about the length of time an American child spends in elementary school or just barely longer than the term of an elected Senator. The time frame for harms from badly designed AI to manifest has passed, it's already underway and we're well down that path.

We're on the path to both realities. We can see them up ahead and are already experiencing the harms we know will grow. It's illogical to do things the way they've been done during a transitional moment, unless your goal is to maintain the status quo. I've yet to meet the foundation or philanthropist who explicitly states that as their goal, so this should be a time of tremendous experimentation and hopeful innovation. I'd love to see it - please point me in the right direction.

Author: Lucy Bernholz
Posted: May 11, 2023, 7:22 pm

                                                                                    Eileen Pan on Unsplash

There's been a lot of writing over the last three decades about the blurring of boundaries between nonprofits, governments, and markets. 

These analyses usually focus on the use of profit-generating tactics by nonprofits (blurring them with market institutions), the growing involvement of nonprofits in public policy (usually discussed either in terms of dark money or organizations with multiple tax statuses such as c3s and c4s), and the use by governments of philanthropy-style incentives (e.g. prizes or matching grants) or direct government involvement in supporting specific companies the way investors do. The whole social enterprise movement is an example of blurring lines between philanthropy and business.

In this context, this story of a government "watchdog" group is fascinating. The article describes how every inquiry into the group by a reporter is met with a different classification claim. Starting in 2021, it claimed to be a 501(c)(3) nonprofit, then it removed the 501(c)(3) part, then in 2022 it referred to itself in a lawsuit as “an unincorporated association of retired and former public servants and concerned citizens that is dedicated to restoring public trust in government.” And then, in January 2023, it labeled itself simply “a collection of individuals.”

Needless to say, whatever it is, it has not been filing any paperwork or tax documents that might explain who is involved and where the money is coming from. 

Just in time for the 2024 U.S. Presidential election, which we can expect to be defined by AI generated information warfare online, we should also be on the lookout for more of these "deepfaked" IRL organizations.

Author: Lucy Bernholz
Posted: May 9, 2023, 9:55 pm
Nonprofits and foundations have been slow to realize that they live in a world of dirty tricks, bad faith messaging, trolls, DDOS attacks, and data breaches. In other words, they inhabit the same internet the rest of us do - one where determining what information to trust is almost a full-time job.
Industry intermediaries - organizations that provide information about nonprofits and foundations to the public - live in this same world. Not only are political groups setting up nonprofits as fronts for political money laundering, they are using the nonprofit information infrastructure to help them spread falsehoods, to slap back at organizations opposed to their views, and basically every other trick of online information harassment.
Here's a picture (a screen shot on my end) of what I assume is a hacked and defaced organizational page on Candid - the biggest provider of data on nonprofits and foundations. 

                                Screenshot taken from Candid website, 3:30 pm pst, May 2, 2023

That's the home page of an organization called the American College of Pediatricians - an anti-gay, anti-trans lobbying group. Clearly, someone doesn't agree with their views.* The group also lost 10,000 records from a Google Drive it left online and unprotected - the leaked information includes all kinds of donor and member data.

The internet and world wide web are trash piles of information. AI systems, such as ChatGPT, spew statistically-produced baloney. All of them readily facilitate harm and lies. It didn't have to be this way, but it is. Many people are clutching their pearls, having ignored the insights and warnings of those who've been pointing out these harms for decades.

It doesn't have to continue this way, though it seems to be doing just that. Glory and hysteria go in cycles - from app to app, crypto to GPT.  Despite all the warnings of a cliff ahead, we seem to be driving faster and faster toward it.

*FWIW, I don't agree with their views. I did not deface their page on Candid.

Author: Lucy Bernholz
Posted: May 2, 2023, 10:59 pm

Today, reporters tell us that staff at X Corp. (the company now responsible for Tw*tter) are going through its database and removing - one by one - emergency services that subscribe to the platform's API - thereby cutting off these departments (fire, emergency services) from using the social media service in emergencies.

GLAAD discovered yesterday that Tw*tter had suspended its efforts to protect transgender people, deliberately removing language in its Hateful Conduct Policy that penalized misgendering and dead-naming.

The Chronicle of Philanthropy reported today that nonprofits are sticking with Twitter, despite... everything going on over there at the company and on the platform.

Hmmm. Why are Niemöller's words ringing in my ears? You know the ones: "First they came for the..."

Author: Lucy Bernholz
Posted: April 19, 2023, 6:30 pm

                                                                                                    Photo by Mike Scheid on Unsplash

Colorado's privacy law INCLUDES nonprofits.

This is as it should be. Nonprofits gather, hold, and rarely protect an enormous amount of very sensitive data of very high value. Think about it - your donations to, volunteer time at, and service from a nonprofit says a LOT about you - much more personal information than your favorite ice cream flavor. Marketers and politicians LOVE this information. And they use it to even further segment and divide us.

Philanthropists need to step up and help nonprofits protect their data, and the whole sector needs to massively improve its data governance and protection processes OR stop collecting data OR stop lobbying its way out of accountability.

Author: Lucy Bernholz
Posted: April 13, 2023, 9:14 pm

                                                        Photo by Jason Dent on Unsplash 

I changed my job in response to the Citizens United decision by the Supreme Court of the United States (SCOTUS). I was convinced at the time (2010) that the Court's decision would lead to the transformation of many nonprofits from advocacy organizations to money laundering tools for political donors. I was right.

It's been hard to prove the scope of this for the very reason it's happening. Nonprofit law allows for donor anonymity; campaign finance law does not. By using nonprofits to "wash" their names from political donations, donors make it very hard to track money back to its source. The amazing web of connections that Jane Mayer drew out in her book Dark Money and ProPublica documented here shows how hard this can be. These concerns were part of what led Rob Reich, Chiara Cordelli, and me to write Good Fences: The Importance of Institutional Boundaries in the New Social Economy (2013).


The rules on donor anonymity that come from the nonprofit sector have proven to be remarkably adaptable tools for "washing" donors' names from political contributions. This can be done by moving money from a c3 to a c4. It can be done by opening and closing a c3 or c4 in-between the required reporting periods. It can be done by creating layers of relationships between c3s and c4s and crowdfunding platforms. It can be done - and is being done - because the laws about nonprofits (and the regulators of them - state attorneys general, the Internal Revenue Service (IRS), and, in the case of Florida, the state Department of Agriculture & Consumer Services) intersect somewhat orthogonally with the laws about elections and political donations (and with the FEC and state level oversight bodies).

What's worse is that Citizens United was only a point on a path. There are trend lines to be spotted and forces at work trying very hard to further dilute any distinctions between charitable anonymity and political anonymity. Today, in an article by Rick Hasen, an election law expert, I read that we are heading toward:

"..a world in which many of the remaining regulations of money in politics could well be struck down as unconstitutional or rendered wholly ineffective by a Supreme Court increasingly hostile to the goals of campaign finance law and extremely solicitous of religious freedom."(fn)

I can't quote more of the article - and shouldn't have quoted that much - as the article is in draft form and was discussed at a conference celebrating Professor Ellen Aprill. (Grateful to the blog post by Gene Takagi that led me to the event.) You can download the draft paper here.

In a nutshell, Professor Hasen uses Professor Aprill's work to show the intellectual and legal history that will likely use religious freedom to deregulate political donations. How? Via the deregulation of political activity in churches and houses of worship. There's much more to it (read the paper) but that gets us started. 

What does this mean for nonprofits? More politics. More money laundering. Less trust. 

What does it mean for democracy? More blurring of boundaries between nonprofit and commercial corporations. More anonymous money in politics. Less trust. More plutocratic control. 

It's not a positive tale. But thanks to Professors Aprill and Hasen, we've been warned. So, what are we going to do about it?

(fn) Richard L. Hasen, "Nonprofit Law as the Tool to Kill What Remains of Campaign Finance Law: Reluctant Lessons from Ellen Aprill," forthcoming, 56 Loyola of Los Angeles Law Review (2023) (special festschrift symposium honoring Ellen Aprill).

Author: Lucy Bernholz
Posted: March 30, 2023, 8:58 pm


                                                                    Photo by Viktor Talashuk on Unsplash

Civil society organizations are on the front lines of advocating for or against the most divisive issues in the United States. The following list is organized by rights. The links are almost entirely to civil society organizations fighting to protect the rights to free expression, free assembly, voting, reproduction, and work. Their civil society opponents on these issues are noted under each section.

(I'm sure there's more to add here - feel free to send additions to or comment below)

Book bans, educational censorship and attacks on free expression

PEN America reports there have been 86 state bills proposed that would censor a wide swath of educational materials and ban books, mostly about Black people, LGBTQ+ people, and discussions of critical race and queer theory (college level). An increasing number of these bills allow a single person to request removal of any number of books, and for those books to be removed before any kind of review. Thirty-two states and more than 150 school districts have implemented book bans.

        Notable nonprofits for book bans:

Moms for Liberty, formed in 2021, has 200 local chapters. It is both a c3 and a c4. Other national groups with branches include US Parents Involved in Education (50 chapters), No Left Turn in Education (25), MassResistance (16), Parents’ Rights in Education (12), Mary in the Library (9), County Citizens Defending Freedom USA (5), and Power2Parent (5).

Another 38 state, regional, or community groups advocating for book removals appear unaffiliated with the national groups or with one another.

        Notable nonprofits against: PEN America, American Library Association, many others

Protest bans and attacks on free assembly

Thirty-nine states have passed laws limiting protest. While a handful of jurisdictions have passed laws limiting the use of facial recognition by police, most places have not done so. In 2021, half of the 42 US federal agencies that are part of law enforcement owned or used facial recognition technology. Corporate use of SLAPP lawsuits against individual protestors is rising. Open carry laws for handguns exist in 36 states and you can carry a long gun openly in 44 states. Guns at protests are hard to square with the idea of peaceable assembly.

Notable organizations promoting protest bans: police associations, Republican officials

Notable organizations fighting against them: Civil Liberties Defense Center, ACLU, BLM


Voting rights

As of 2021, nineteen states had passed laws making it harder to vote. Eighteen states carried 152 bills to restrict voting over into 2022.

Notable organizations promoting voting restrictions: Americans For Prosperity, Heritage Foundation, ALEC

Notable organizations fighting against restrictions: Voting Rights Alliance, ACLU, some election administration groups, Fair Fight, Brennan Center, Lawyers Committee for Civil Rights

Reproductive rights

These bans have split the U.S. in two - with 24 states banning access to abortion. These states are also adding vigilante bounties and surveilling communications and travel.

Notable organizations promoting reproductive restrictions: see this list

Notable organizations fighting for access to healthcare: see this list

Right to work

These laws, whose name implies one thing but which actually restrict labor's right to organize, exist in 27 of the 50 states.

Notable organizations promoting right-to-work laws: Americans For Prosperity, Heritage Foundation, ALEC, Republican Party

Notable organizations fighting against restrictions: AFL-CIO, SEIU, Center for American Progress, Democrats

Behind all of these organizations are donors. Some are heavily supported by individuals, others by foundations, others by corporations. Many rely on crowdfunding or on a mix of all of these funding structures. Behind each issue, on each side, is a mix of 501c4 and 501c3 organizations - an approach that makes it easy to hide the identities of donors whose interests are primarily political but who desire anonymity. New case law on donor anonymity in such situations, and conservative groups' efforts to enable even greater anonymity for political donors, further complicate our ability to know who is funding what.

I don't have a conclusion to offer to this post. Yet. Instead, view this as "first draft thinking" for Blueprint 2024. I welcome your feedback.

Author: Lucy Bernholz
Posted: March 29, 2023, 7:59 pm

Photo by Bram Van Oost on Unsplash 

If you’ve been reading the news you know that Silicon Valley Bank (SVB), a bank that heavily catered to VC firms and startups, collapsed and its depositors are being saved by the US Treasury. You know that hearings are being called for in Congress and the same old battle lines between the wealthy (people and institutions) and everyone else have been re-animated. And you can infer that there was (and is) a whole lot of backroom-ing going on.

You also know that SVB had lots of money in accounts held by nonprofit organizations, including affordable housing organizations. 

You also know that OpenAI, the once not-for-profit-now-for-profit research organization, has released GPT-4, a large language model (LLM) update to its previously (as in three months ago) released GPT-3. You’ve heard about generative AI, read stories about how “nasty,” “smutty” or “just weird” the outputs of the GPT models are, and you may have even “played” with or worked with these models. On Mastodon I found a thread of nonprofit staff sharing stories of how they’re using ChatGPT to expedite the funder-driven time-suck of cutting the 1000-word description of your programs and their world-changing effects required by Funder A into a 300-word description for Funder B.

And you’ve probably seen, perhaps read, maybe skimmed the numerous articles and abundant research on how the LLMs are biased and the outputs are “hallucinations” (yes, that’s what they’re called). As for SVB, you may have seen stories or tweets or blog posts about how the collapse of SVB will lead to an immediate funding disaster for all Bay Area nonprofits.

I want to posit two things. First, jumping to insights or conclusions right now about the effects of either the bank collapse or generative AI makes for good Twitter (if there is any such thing anymore), but isn’t reality. It’s punditry, lobbying, or sales. Second, think about the intersections between these things - emerging tech systems, corporate hype, cost of living, need for and role of nonprofits, risk management in banking, risk denial in corporations, risk and responsibility of governments, philanthropic product choices by wealthy individuals (DAFs, LLCs, private foundations, community foundations) and, finally, the overlap between these categories in terms of actual number of people involved. 

It’s too soon to know how these things will play out at a sector level. Those of us on the outside of SVB and/or OpenAI don’t know as much as we think we know. We don’t know all the ways they intersect. The best anyone can do right now is: 1) find out whether you have exposure to SVB or Credit Suisse, either directly or through your funders (true for startups and nonprofits), mitigate appropriately at the organizational level, and check on your own bank, given the potential ripple effects of individual bank problems; 2) put on your hype-goggles, convene your nonprofit’s data governance review committee (What? You don’t have one?), and start thinking now about who generative AI helps, what it does well and where it is dangerous, if and how it aligns with the mission of your organization (the mission - not the development or marketing departments' metrics, but the actual mission), where (within what software you use) algorithms are already at work, what data (on whom) you’d be feeding to a third-party corporation (such as OpenAI) if you start using it, and what that means for your constituents.

These two things - a bank collapse and new technology - ARE likely to have BIG societal impacts. But understanding them will take time. And their impacts won't unfold along "straight lines" from A to B. There will be all kinds of additional "developments," intersections and interactions between impacts, and mitigations and responses. Don't fall for the quick analysis - it’s all operating on incomplete information.*

Just like the weather in California, judging from the winter we’ve had, forecasters (armed with actual meteorological and longitudinal data) are noting that we’re in for a long, strange Spring. That’s about all we can guess is coming from these two recent events. Strange times ahead. Keep your goggles on.


 *Speaking of incomplete information, Time Magazine is running a story describing how some of the biggest names in Effective Altruism knew about the financial shenanigans of their most famous, duplicitous member, Sam Bankman-Fried. Yet, they were still "shocked and dismayed" when his crypto-empire turned out to be built on fraud.

Author: Lucy Bernholz
Posted: March 15, 2023, 9:16 pm


                                                            Photo by Kier in Sight on Unsplash 

Proud to say this article, "Digital Public Policy: New Priorities for Nonprofits," has just been published. It is derived from lessons learned preparing the Integrated Advocacy report and this article on media coverage of civil society and COVID.

My co-authors, Toussaint Nothias and Amelie-Sophie Vavrovsky, and I outline the many ways in which civil society is now bounded by and dependent on the many public policy domains that shape digital spaces.

The most basic distillation of the argument is this: civil society is where we express ourselves, gather together (for non-market, non-state activities) and take collective action, often contrapuntally to the "mainstream" actions of markets and governments. In our times, most acts of expression (or mere communication) and gathering are dependent on information exchanged digitally. Just as digital practices and public policy shape online expression and assembly, civil society also shapes digital practices and policies. They are entwined with each other. Whether we are considering public policy decisions about privacy, expression, assembly and association, or considering regulations about philanthropy, nonprofit structures, and protest or free expression, we are talking about enjoined systems.

You can download a copy of the article here. (I hope it's not paywalled.)

P.S. Thanks to everyone who has reached out to me after receiving these blogs posts/emails and offered good wishes, hoping that the return of the blog indicates an improvement in my health. I wish they were directly correlated. In fact, my return to blogging is motivated by the destruction of Twitter. I am chronically ill and disabled by Long Covid and am blogging when I can.

Author: Lucy Bernholz
Posted: February 27, 2023, 2:25 am

                                                                        Photo by Jordan Rowland on Unsplash

Earn to give. 

Make as much money as you can to give it away. 

Why are we surprised that messages like this would provide incentives for people (or be used as justification by people) who just want to make lots of money? 

This story from The New York Times seems at first as if it will pull back the curtain on the logic that making money at all costs is OK if you're going to give it away. But it doesn't. Instead, it joins the legions of articles written about effective altruism and the potential crimes at FTX that inherently reify that logic.

Rather than the FTX debacle unleashing a broad conversation about wealth and responsibility, philanthropy's roles in making amends for harmful actions, or *gasp* real questions about capitalism and justice, the FTX scandal is philanthropy's version of asterisked hall of famers. Those involved in FTX are being treated like the Pete Roses and Barry Bonds of philanthropy. The more that stories about FTX repeat these tropes about effective altruism, the more they reinforce it as an excuse, a justification, even a reason for fraud.

Philanthropy - and here I'm talking about big philanthropy, institutionalized and with extraordinary resources - has been a tool for cleaning up reputations (of individuals, corporations, and whole industries) for a long time. Philanthropy as an acceptable pre-condition for malfeasance is the throughline to much of the press coverage on FTX. 

What's notable is that the press I've seen calling out this problem is that which quotes other effective altruists or those who disagree with its underlying philosophy. Other parts of organized philanthropy haven't had much to say. And that says a lot.

Author: Lucy Bernholz
Posted: February 21, 2023, 8:34 pm


                                            Photo by Edge2Edge Media on Unsplash

You knew I was going to have to do it. So here it is (Courtesy of ChatGPT Feb 17, 2023)

The first line is the "prompt" I typed into ChatGPT. The rest of the text is the answers it provided to me.

Answers such as these don't bode well for small community-based groups. The AI doesn't overemphasize "overhead" concerns, but it does include them, as it does "outcomes." The first answer promotes aligning your giving with your values (fine), but then it goes on to suggest organizations without any concern for what my values might be.

Looks like a #buzzword-trained AI to me.

Author: Lucy Bernholz
Posted: February 17, 2023, 7:32 pm

                                            Photo by @chairulfajar_ on Unsplash
Today's New York Times features an article on how tech companies are dismantling their Trust and Safety Teams (free article). This strikes me as akin to price gouging by oil companies during the inflationary moment we're in - taking advantage of the great tech-layoff-contagion to get rid of something they don't seem to have ever really wanted. 

Let's just acknowledge that we can't trust anything posted on social media (and the most vulnerable, the most outspoken, and the rest of us are all facing more harm). We can't trust the answers from ChatGPT, and the tech companies are racing each other to integrate similar AI systems into their search products and elsewhere. At the same time, the companies are less and less interested in making any data available to independent researchers who might check the companies' own claims. There are lots of efforts to ensure access - the EU's Digital Services Act, proposed legislation called the Platform Accountability and Transparency Act in the US, and the Coalition for Independent Technology Research - but none are perfect and all must reckon with serious concerns about people's privacy.

I've noticed an uptick in my email of research reports from nonprofits and advocacy groups. I suppose this makes sense in a time of continued pressures on journalism and the swamp of bad information that is the internet. How should we know whether to trust these reports? Chances are each of us will only receive such reports from organizations with which we're already aligned, or from organizations that have bought email lists from other organizations with which we're aligned. That sets us all up for an ever-growing pile of one-side-ism. 

I'm pretty sure I've never ever received a report from an organization that criticizes the organization or its outcomes. Occasionally, I receive one after a scandal, in which the report guarantees me the problem has been solved. I have received some self-searching emails about the claims of sexual harassment in the Effective Altruism community, by people in the community (I am not in it), but those are about "culture" and "governance," not so much the work itself.*

Here's my question for nonprofits and foundations and activists and associations - to civil society, basically - how do we trust you and your research?

This is a sector-wide issue. What mechanisms, credentials, cross-checks, editorial practices, industry norms need to be developed and implemented before civil society's signals become indistinguishable from incessant noise?

*Kelsey Piper, who identifies as an effective altruist, has a decent example of soul-searching about EA and harassment in her newsletter dated February 15, 2023, for Vox. Although she nods to the homogeneity of the EA community, she doesn't draw any further inferences about the problems in its giving approach, governance, or harassment.

Author: Lucy Bernholz
Posted: February 15, 2023, 9:22 pm

I wrote about Grift - Gifts here. I didn't think of the term soon enough to include it in the Buzzwords list for Blueprint 2023 - which is now available. Get your free copy here.

In The New York Times print edition, December 15, 2022, the following ad ran on page A7. I don't know where else it ran.

It's a full paid ad (estimated cost is at least $150,000), from the Stellar Development Foundation, arguing that fraudsters (e.g. Sam Bankman-Fried of #FTX fame) are giving crypto a bad name. It goes on to describe how the blockchain is facilitating direct cash transfers across international borders and without banks - enabling aid to people in Ukraine who are under attack. (It leaves out any mention of how this also allows movement of money for other purposes, from sanctions avoidance to money laundering to weapons/drugs purchases and, of course, support of the invaders.)

Good. It's past time we had a real discussion about the good and bad of crypto. Not a hype-fest or a crossed-arm, "it's all bunk" argument, but a determinative discussion about whether and how it can be used to help people. And what the RULES need to be to eliminate the centralizing, wealth-extracting, environment-destroying aspects of it - IF it's going to be used. 

Differentiating grift-gifts from "crypto-philanthropy" is going to come down to the regulations and oversight mechanisms built around the technology and the groups participating. It's about the legal code, not the software code. 

Author: Lucy Bernholz
Posted: December 15, 2022, 7:11 pm

It's beginning to look like SBF (Sam Bankman-Fried) and FTX (the cryptocurrency exchange) are going to go down in history as the biggest case of "philanthropy-washing" ever. So big and important, I've coined a new word: Philanthro-grift.*

The U.S. Department of Justice has arrested SBF and charged him with intent to defraud investors. The court-appointed clean up CEO of FTX told the U.S. Congress "This is just old fashion embezzlement, taking money from others and using it for your own purposes. This is not sophisticated at all." (<- that's a direct quote from Dave Pell's NextDraft newsletter. You should read it).

Here are the implications for civil society and philanthropy.

  1. FTX and SBF spent a lot on "philanthropy" - especially organizations aligned with #EffectiveAltruism. All part of his (alleged) grift.
  2. If the whole crypto industry is a sham (as many think), what the heck could possibly be valid about crypto giving and philanthropy? 

Crypto is bad for the environment, a sham of a financial system, and - at least in SBF's case - part of a major grift. How do civil society organizations justify being part of any of it? Really, it takes philanthropy-washing to a new level.

I think someone could start a "philanthropy is going great" website (modeled after Web3IsGoingGreat and TwitterIsGoingGreat).

Final thought - SBF's grift extended to political contributions (some public and some dark money). We need new rules.

*I also considered "grift gift" (or "grift giving"). Which do you prefer?

Author: Lucy Bernholz
Posted: December 13, 2022, 9:10 pm