Content from my Philanthropy 2173 Blogspot feed

PHILANTHROPY 2173

The future of good

FastCompany Magazine "Best Blog"
Huffington Post "Philanthropy Game Changer"

I'm finding it difficult to focus and track time. From what I gather, so are many others. These are (simultaneously) problems of the fortunate, signs of grief, life-altering though not life-threatening, and opportunities.

These days I'm not just writing and teaching about digital civil society, I'm watching as people all around me come to realize they're living it every day. Some things I'm seeing -

Video conferencing is rapidly morphing into distributed broadcasting - I've attended hosted interviews with authors, a discussion on public policy among African American scholars, a comedy show, and a city council meeting - all essentially broadcast on video conferencing software. Some were free, some were paid. All were more interactive (thanks to hosts, moderators, and chat functions) than simply watching television. Some of them used several screens (the broadcast, the chat function, social media) incredibly well - incorporating audience questions, weaving social media comments into the chat and responses from the panelists back out into social media spaces. Some foundations are getting into the act. I'm sure nonprofits are doing the same.

Teaching via video conferencing allows me to connect across classes, schools, and disciplines. I've taught two classes simultaneously via one video call. I'm sitting in on seminars at universities in other countries. Doing this is one thing; doing it well is another.

Video conference virtual backgrounds are the new protest "banner." See this from the Stanford faculty senate meeting (purple backgrounds were demands to pay contract workers).
My email inbox and snail mail inbox look like December - fundraising appeals are flooding in - some from nonprofit organizations and as many from bookstores, restaurants, cafes, and other local businesses (the ones that were supposed to get the Congressional money - which seems to be going mostly...elsewhere). Donation requests from commercial companies - please take note of this.

There is no bottom to how data-sucking companies will use this pandemic for their own ends - Palantir has been awarded federal contact tracing contracts, Facebook is making "grants" that claim all the "grantees" data, and "contact tracing" apps are a scourge of this scourge.

Meaningful community organizing is really hard under shelter-in-place conditions. Governments are using the pandemic as an excuse to ban gatherings. How we come together for political and civic action during and after this is of critical importance - and it is being invented in real time, right now. It's not just about protest and organizing - it's all of civic and political gathering. Micah Sifry is thinking about this; we're working on it at the DCSL; join us.

There's hope in human networks of care. 
Author: Lucy Bernholz
Posted: April 23, 2020, 10:48 pm
By now you've probably seen at least some of the news that Zoom, the video conferencing program that has quickly become a verb, is full of security and privacy risks. It is sharing/leaking data to other companies (Facebook), being investigated by Congress and Attorneys General, not really end-to-end encrypted, and so on. It's also become a home for "networked harassment," as Joan Donovan puts it - zoombombing by racists, sexists, and other creeps (insert stronger word here).

You, who may have spent the last 21 days living on Zoom - for work, play, family, and exercise - may feel like putting your head down in resignation at this news. (Go ahead, I'll wait).


But we can also use this to really understand what it means to be digitally dependent. You've heard me refer to our digital technologies - the software, hardware, and telecom choices we make - as our new landlords. We "rent" space from them and they set the rules. We "rent" space from them for our emails, our text messages, our video calls, our cloud storage, our shared documents, all of it. And, let's be really clear, unlike our landlords in physical space, the space we rent from our digital landlords is all open floor plan, and the landlord sits in on every meeting, taking notes that he keeps for himself (and uses for his own purposes).

With that image in your mind, your best bet for protecting the information you care about from your digital landlords is to assume they are listening, taking notes, and profiting from those notes. In the real world, if you could see your landlord sitting there at every meeting and on every call, you'd adjust your conversations accordingly. In the short term - the same is true with your digital landlords.  

In the bigger picture, you can follow, support, and get involved with your civil society peers who monitor the digital landlords and keep the pressure on them and regulators to change the way these tools work. That's digital civil society in action. 

For more on Zoom see this from Joan Donovan, these from Doc Searls, and there's been good media coverage from Vice, The Verge, and The Intercept.
Author: Lucy Bernholz
Posted: April 2, 2020, 4:20 pm
I've been arguing for several years now that civil society is dependent on digital systems, which are not neutral, not designed with civil society in mind, and not innately democratizing. Our dependence on digitized data, commercial software and hardware, and global communications networks lays bare the fallacy that nonprofits/foundations, donations of money, community organizations, political activism, informal associational life, mutual aid networks, kinship care - any of the activities that take place in civil society - are independent from market or government forces. Not only are they not independent, they're entirely dependent.

Those dependencies change the nature and boundaries of civil society. They require us to revisit old assumptions (such as the idea that the sector is independent from markets or governments), to build new technology and organizational practices, and to attend to a different set of policy domains in this dependent stage.

This is the entire premise of the Digital Civil Society Lab.

This has become painfully clear to millions of people in the last weeks as they've tried to work remotely.  The first step to doing so - after caring for family and finding a place and time to sit down - is to figure out how to get the tech to work. Whether it's using conference calling software safely, figuring out where the "mute" button is, accessing the office server, using work email on your own phone, getting hot spots and functioning laptops to your staff - chances are you've been dealing with "work tech" lately in ways that make it painfully clear: your work depends on digital systems.

As the authors here put it,
"The pandemic also lays bare the many vulnerabilities created by society’s dependence on the internet. These include the dangerous consequences of censorship, the constantly morphing spread of disinformation, supply chain vulnerabilities and the risks of weak cybersecurity." 
 Laura DeNardis, one of the authors of that article, has a book called The Internet In Everything.

That's another way of saying civil society is digital civil society.

We've written about the new kinds of policy advocacy that digital civil society demands.

We're working with partners across California to help with nonprofits' organizational capacity needs that start from these digital dependencies.

We're amplifying and hope to partner with others doing similar work.



Author: Lucy Bernholz
Posted: March 27, 2020, 3:51 pm
Flatten the curve

Shelter in place

Mutual aid

Caremongering

Zoombie

PPE

Physical distancing, social solidarity

Zoombombing

Sigh.

The list just goes on and on. On the one hand, these terms don't seem to have anything to do with philanthropy or digital civil society, the topics of this blog. On the other hand, they have everything to do with those topics.

Wishing you health, care, and connection. 




Author: Lucy Bernholz
Posted: March 23, 2020, 4:58 pm
A rebirth of mutual aid. Phone trees. Grocery pick-ups for your neighbors. Pooled funds for loans to cover unexpected expenses. Sharing physical goods within communities.

Many things that defined philanthropy before it became a formalized industry unto itself (early 1900s in the U.S.A.) are coming back into fashion.

This list from Allied Media Projects has some Michigan-based examples of mutual aid.

Here's a list of collective care opportunities.

On March 1 I delivered a manuscript to my publisher called How We Give Now. In it, I noted the signs of this kind of rebirth - so it's amazing to see it happen at such scale in just a few weeks. But all the signs were there - connectivity, familiarity with direct giving (due to crowdfunding), cultural traditions of mutual aid that never went away; they only stopped being seen by the formal "counters" of philanthropy.

The manuscript also notes how exclusionary (discriminatory? racist?) our current built system of tax policies for nonprofits and philanthropy is - the legal and tax incentives that define the "nonprofit sector" in the U.S. have always privileged wealthy, white, Judeo-Christian norms and practices - and we've counted those behaviors as if they were the whole of giving. They never have been and they're not now. It's good to see long-called-for changes like a universal charitable tax deduction finally getting attention, although no one wants a crisis to be the reason.

This policy change would be more inclusive than what we have, but it is still lacking in both imagination and consideration of how we give. It still assumes that "philanthropy" is about giving money to a privileged type of organization and that our giving is shaped by tax incentives. (After Trump's 2017 tax giveaway, the percentage of people who itemize their tax returns - and thus benefit from tax deductions - dropped from about 25% to less than 10% of tax filers. In other words, current tax deduction policy does not matter to more than 90% of Americans.)

Mutual aid and direct giving are two very visible signs that the assumptions driving public policy change are not based in an understanding of how we give now. Consider policy ideas to boost giving that start from where a lot of it happens - online - and build from there. Privacy laws and protection from fraud. Broadband access. Encryption. Consumer data protection. Control over who sees what we're doing and with whom. These are the kinds of policy domains that matter to how people give using payment apps, "donate now" buttons, and online platforms.

In the summer and fall of 2019, as I was writing the 2020 Blueprint, I made two predictions. One was that the year would bring a global recession. The second was that said recession would reveal the fragility of our built institutions, including nonprofits and foundations. That fragility has been hidden by a multi-year story that conflates a booming stock market and gross corporate profits with the lived economy of people.  New institutions will emerge, but the transition will be bumpy. I'm sad how right I was on the first; we'll see if I am correct on the second.

The global health crisis we're now in is going to peel away many such vanities. Buried in this crisis is the opportunity to re-examine what we need from civil society and philanthropy and how we get it. What do we need in order to be able to voluntarily connect and use our private resources (digital and analog) for public benefit? How will we control our associational choices in an age of platform dominance (which has already done so much damage to our control of expressive and private spaces)? It's time for first principles; time to ask ourselves about the basics of civil society, aid, altruism, philanthropy, democracy; time to go back to our assumptions about participation and membership and equity and justice.

Author: Lucy Bernholz
Posted: March 20, 2020, 7:43 pm
Italians singing from their balconies.

Spaniards doing calisthenics, led by one trainer on a rooftop.

Peruvians thanking doctors - and me watching it in California.

Penguins walking down stairs in an empty (of humans) aquarium.

Virtual museum tours.

People sending cash directly to strangers in need, via their phones.

Governments expanding their surveillance capabilities under the guise of public health, but neatly omitting any plans to "turn it off" when situations change.

Online class recommendations about racist robots, shared from one faculty to another. 

Global, distributed professionals - people who work for every kind of organization, from commercial firms to nonprofits, universities to tech companies, independent contractors to professional service providers - suddenly all dependent on one single digital platform (Zoom) all day, every day, effectively bringing much of the data on their phones and laptops onto one company's servers (and no, you didn't read the privacy policies). Yep, you're working in Zoom's world now; your data is theirs - your chats, your conversations, all of it.

A social media connected world without human content moderation.

Gaslighting every day, from every direction, ignoring the digital video and audio archive and preparing to call it all "deep fakes."

Deep fakes.

Friends telling stories over video chat to their neighbors' children to give the parents a wee break.

Churches using SMS to deliver sermons while volunteers stand in the street to feed the hungry.

We all live in digital civil society now. We have for years, but it's taken many people a long time to catch on. In January, the Digital Civil Society Lab released a report about how this is true, why it matters, and what to do about it - why public policy on the digital environment, from rights to automated decision making, AI to biometrics, broadband access to zero rating, comprises the policy domains that matter to civil society now. Those rules shape, bound, govern, and disenfranchise us as people when we try to come together to do things for the public good. Those rules shape how philanthropy works, where nonprofit data lives, who can protest and who cannot. There is no civil society without digital rights.

Digital policy and practice shape civic space.

Get involved. Your relationships, your neighbors, your community, and, yes, your democracy (if you have one) depend on it.
Author: Lucy Bernholz
Posted: March 18, 2020, 2:36 am
I'll get back to writing...soon.

On March 1, 2020, I delivered a manuscript to a publisher for a book called How We Give Now. It is scheduled to be on bookshelves in Fall 2021. So, I've been in a time warp for some time. Ask yourself this question - how will we be giving a year from now? - and then try to write a book about it now. It's a whiplash in time. Amazingly enough, chapter 10 of the eleven chapters is a thought experiment on giving in disasters - guiding people through the pluses and minuses of all the ways we give our time, money, and data. (I'll no doubt be updating that chapter when the manuscript comes back from peer review.)

These moments, with rapidly changing information and intensity of emotions, spark familiar feelings, fears and anxieties for many of us, and are brand new for some. Some people are functioning well now, others aren't; and we are likely to change places on that spectrum many times in coming days and weeks.

So, here are a few wonderful crowdsourced resources for nonprofits, virtual work, and tech that I have found and wanted to share.

Hats off to those who are curating these, thank you. Feel free to suggest others.

Many things nonprofit-related:
https://bloomerang.co/resources/covid19/

From Beth Kanter, Janet Fouts, Linda Baker, Sarah Goddard, Susan Tenby, Wendy Harman, Meico Marquette Whitlock, Barbara O’Reilly, Jessie Mooberry, Farra Trompeter
https://docs.google.com/document/d/1k5pC-R1V4SK4bRPN7cqh9WpIxubXP5tt8qpF5hinMoQ/preview

Virtual meeting facilitation from Beth Kanter
http://www.bethkanter.org/facilitate-virtual-meetings/

Digital tools for churches: 
https://docs.google.com/document/d/1fYN5QW1QnA6ofCLNdUNeSMARI44anqTsEAoXu7mRwFE/edit#heading=h.ordbwq6yk5y

Coronavirus tech handbook:
https://coronavirustechhandbook.com/ 

Nonprofit finance book (NFF)
https://nff.org/blog/covid-19-what-nonprofits-should-do-right-now

Fundraiser Sarah's tips
https://www.fundraisersarah.com/post/stay-strong-the-fundraising-community-pulls-together

Ontario Nonprofit Network
https://theonn.ca/nonprofits-on-the-front-lines-of-covid-19/

 Please, take care of yourselves, your loved ones and your communities. If you can help someone today, do it. Stay physically distant but socially connected.


Author: Lucy Bernholz
Posted: March 17, 2020, 6:26 pm
Professors Rob Reich, Mehran Sahami, Jeremy Weinstein, along with Hilary Cohen and Mohit Mookim, sent the following letter to students last week. Posted by permission.
******
The crisis that is unfolding around the world with the coronavirus pandemic is frightening. And it appears likely that things will become worse before they get better.

We can only compare the mood of anxiety and uncertainty to what we experienced on 9/11 and during the financial meltdown of 2008. It is the sort of event that has enormous consequences for politics and economics, some that we can see plainly and others that we cannot yet anticipate. It is also the sort of event that etches itself into individual and collective memory. These coming months will be a defining feature of your life’s story and of our societies. In the future we will talk about the Coronavirus Pandemic of 2020.

None of us has any expertise to offer on the science of coronavirus or appropriate public health measures. We will not pretend to offer anything useful beyond reinforcing the importance of social distancing and handwashing.

But we need not have any expertise in those domains to offer other kinds of advice. The kind of advice drawn from having lived through other defining events like 9/11 or 2008, and possibly – though only possibly – having a bit of wisdom as a result.

First, the urgent measures necessary to mitigate the effects of coronavirus, and thereby to save lives, will likely generate much turmoil, confusion, and anxiety in the coming weeks. Perhaps months. Planning for the short term future suddenly is very difficult. Will classes resume in person at all? Will commencement happen? If I leave campus, when will I see my friends again? Will I even be permitted to travel?

The answers are unknown. A great many things are beyond our control. We like to remind ourselves that, if this is our first time feeling this sense of loss of agency in our lives, we must consider ourselves lucky — multitudes of people around the globe have to deal with an inability to plan for the future frequently throughout their lives.

Still, it’s important to remember what is in our control. Yes, that means strict personal hygiene, handwashing, and social distancing. But just as important is ordinary kindness for each other and for strangers. Simple gestures of kindness and friendship are always important, and they are even more so during crises.

It is a wicked irony that you can best serve the wider community, and practice empathy for vulnerable people, by reducing your contact with the wider community. The most pro-social thing to do right now is to practice social distancing.

And yet you can make an effort to reach out to people around you and check in to see how they’re doing, how their friends and family are doing, and whether they need any help. If they need help, try to organize help for them.

Second, in that spirit, we recognize that the current situation imposes many hardships on students, especially on those with family members directly affected by coronavirus, those who face financial difficulties as a result of the crisis, and those who have no safe place to go other than perhaps their campus housing. This is a time of heightened stress, and people respond to stress in many different ways – from diving into work to decision paralysis – all of which are normal. And when stress is combined with feelings of isolation and actual social distance and self-isolation, everything is worse.

We strongly encourage you to reach out, virtually, to your closest friends and loved ones. Invest in the communities to which you belong—bring whatever energy you can to them and let them, in turn, be a source of inspiration to you.

We are currently working on several approaches to engaging the Stanford community to assist. One resource that the Stanford community has already developed is this “Community Offerings” spreadsheet (link removed for blog post). In the meantime, share with relevant university leaders any urgent issues concerning the university’s crisis response.

Third, and finally, we have a practical suggestion for you. This is a nearly unprecedented crisis at Stanford and across the country and the entire world. It will be studied in the future by historians, scientists, economists, public policy makers, and more. And as we said at the start, it will be a defining experience of your life.

In that spirit, assuming that you and your loved ones are personally safe and healthy, we encourage you to document your own experience of these days in writing. We are inspired by this tweet from a Yale colleague, Nicholas Christakis:

“If I were a college president & was closing school for a once-in-a-century reason, in order to help build campus community & curate a historical archive, I would set up a system so students could record their experiences, as a kind of individual and collective diary. #PlagueYears”

Start writing today. And if you’re a builder and would like to volunteer to help make a system to facilitate this shared documentation and reflection on a wider scale, contact us.

We close with a reflection on crisis by Harvard classics and philosophy professor Danielle Allen, who delivered her words as a speech at the University of Chicago on September 12, 2001.

Take care of yourselves and of each other.
 *****
I am glad I know the authors of this letter and grateful they agreed to let me post it. I hope it helps you, also. Please take care of yourselves, your loved ones and your community. If you can do something to help someone, please do. 

Author: Lucy Bernholz
Posted: March 16, 2020, 3:02 am


IMPORTANT UPDATE: Please read this response and thread, and my apologies for first using the logo without permission. I very much appreciate having this broader perspective brought forward by @CassieRobinson, Stefaan Verhulst, and others.


Original post
Since we started the Digital Civil Society Lab I’ve been invited to countless conferences, workshops, and philanthropic or corporate launches of “some kind of tech” for “some kind of good.”


I always say no. I refer to it as the "no 'blank for good'" rule. The framing is entirely wrong. And it's not just a little wrong; it's a lot wrong. Everything about it, from the grammar to the implications for humanity, is wrong. And industry's recent responses - a thousand conferences and initiatives to put ethics into tech - get the grammar right, but the solution is still wrong.



Here’s a short list of reasons why I ask you to reconsider your tech/data/machine learning/cloud/AI or other technology/computational effort for social good, organized from snarky to serious (and back to snarky):



1)   First of all, enough with the "good" language. If you're not even willing to do the work of defining what you mean by "good," or of understanding the specific challenges and strengths of civil society or the public sector, then stop right there. There is no universal definition of good. This phrasing is hollow marketing rhetoric for selling a product (perhaps at reduced rates) to civil society or public sector agencies.

2)   Repurposing a technology built for commercial purposes (and that's what the blanks almost always are) into civil society or public sector work ignores that those other domains have fundamentally different values and purposes - of equal importance to, but different from, commerce. Selling ads and providing justice are two entirely different domains, inside and outside of the algorithm. You can't simply transfer a tool or way of working designed for commercial purposes to the other systems and not cause harm.

3)   Another level of #2 above: Context matters. In addition to mattering IN AND OF itself, the context of civil society or public service shapes the values, incentives, purposes, motivations, participants, affected populations, and cost-benefit analyses of those sectors. Ignore it, and you get a public sphere full of lies and propaganda, “efficient” decision making systems that amplify racism, procedural approaches that trade fairness for scale, “security” systems that make people less safe, and accelerators for injustice.

4)   Commercial technologies prioritize scale and efficiency. Those values rank below justice, equity, truth, participation, a healthy environment, beauty, deliberation, and MANY other priorities in civil society and the public sector. What commercial applications call externalities, civil society and the public sector exist to address.

5)   The lack of universality in terms of what “good” means applies to ethics, also. There is no universal ethical frame. No “ethics” you can pull down off a shelf and wash over your existing organizational (usually commercial) priorities. A commitment to ethics is a commitment to discuss, debate, decide and enforce a set of values, purpose, conflict, tradeoffs, and loss. A debate about values and implications and choices and bad outcomes and structural recourse for those who are harmed (groups, societies, communities, collectives – not only individuals). Consequences matter – internal and external. If you’re not willing to go there; well then, it’s marketing.

6)   I think the focus on “ethics” in tech has largely become anti-regulatory rhetoric (especially when used within companies). Ethics matter – but they must be specified, debated over, deeply integrated into the people, incentives, and structures of an enterprise, and learned and practiced throughout one’s lifetime. They don’t come on/off like clean room booties. And one officer or office of ethics inside billion-dollar companies? Not enough.

7)   (A return to snark) The framing of "blank for good" is much more honest than its users realize, because it doesn't just imply - it basically declares - that every other use of the "blank" is for, shall we say, not good?



Simply from a grammatical standpoint, it's a "subject object" problem - but of course, it's much more than grammar. It's not that civil society and democratic systems need tech; it's that tech (and all of the derivative disciplines noted above) needs to be purpose-built for humane values and a commitment to just use, and we need publicly-determined redress and braking systems. There are some things we just should not do. We need to flip the script; change the subject and object of the phrase. The opportunity costs of not doing so are more hollow promises, more handwaving distractions, and more digital corporate capture and damage to our already deeply broken governing systems and communities.



As public awareness has grown about surveillance, bias, power asymmetries, lack of accountability or due process, and opacity in the digital systems we depend on, we’ve spent the last few years having conversations about ethics in tech. For all the reasons I outline above this is not sufficient. We need to reverse the subject (tech) and the object (good) to be about teaching, designing, building, releasing, and being able to put societal values and just procedures into tech, not putting tech into “good.”



Here are some examples of what we need: Antiracism in AI. Justice in data. Equity in the cloud. Information symmetry in social media/search. Financial resources to support vulnerable people in designing, directing, and governing systems that serve their needs; where communities are recognized as powerful actors whose needs can stop tech that furthers inequities, impoverishes communities and the planet, sacrifices safety, prioritizes individual wealth accumulation over people's needs, or enables racism, misogyny, hate, and domination.



Please, invite me to the launch of those initiatives, conferences, and philanthropic efforts. I’ll be there;  I have a lot to learn.




Author: Lucy Bernholz
Posted: March 5, 2020, 5:29 pm

For the last decade(s) or so, there has been much attention to the blurring of lines between nonprofits and social businesses. Much of the pressure is self-inflicted by funders and nonprofits looking for sustainable revenue models. The rise of impact investing is part of this story. I'm bringing it up just as background - just to say that any nostalgic differentiation between businesses and nonprofits needs to be reconsidered by looking at the reality of revenue generation practices.

This isn't the blurring that interests me, however. It's on the other boundaries - between charitable (nonprofit) and political. And here I think we have an analogous experience - unfolding as we speak - to learn from. In short, my hypothesis is this - the changes in the news media landscape in the USA over the last 15 years portend the future of nonprofit organizations in the country.

It's not a pretty story. It's one of failing business models, new alternatives, a failure of professional ethics and practice to provide distinguishing value, an explosion of choices that consumers don't differentiate between, and a collapse of trust.


https://backchannel.com/the-most-important-law-in-tech-has-a-problem-64f5464128b6#.1nb8s2ovs

https://medium.com/humane-tech/the-online-abuse-playbook-575648c9f798#.hfsw0k34z

http://pressthink.org/2016/12/prospects-american-press-trump-part-two/

https://scholarlyoa.com/publishers/

https://nonprofitquarterly.org/2017/01/04/problem-predatory-journals-fake-academia-joins-fake-news/

https://sunlightfoundation.com/public-whip-count-for-vote-on-changes-to-office-of-congressional-ethics/


Author: Lucy Bernholz
Posted: February 3, 2020, 10:15 pm
Do you work for a "dot org"? Do you know there's an effort underway (currently put on hold by the California Attorney General) to sell the registry of .orgs to a private equity firm?

At a time of hand-wringing over closing civic space and the many ways technologies are being turned against us, this is the most concrete, least metaphorical example of how to shut down nonprofits I can imagine. Simply raise the rent on all nonprofits by raising the price they pay for their .org listing 2x, 3x, ...10x.

The .org signifier (or top level domain name, as it is known) is no longer proof of nonprofit status, its use isn't limited to nonprofits, and nonprofit status itself is increasingly slippery as a guarantor of good things, but still, the .org matters. Does your nonprofit or foundation want to lose it? Can you pay two, three, ten times more for it?
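
The arithmetic of that rent increase is worth making concrete. Here's a quick back-of-envelope sketch in Python; the roughly $10/year base registration fee and roughly 10 million registered .org domains are my illustrative assumptions, not figures from this post:

# Back-of-envelope: what "2x, 3x, ...10x" rent could mean.
# Base fee and domain count below are illustrative assumptions.
base_fee = 10                  # approx. annual .org registration fee (USD)
registered_orgs = 10_000_000   # approx. number of .org domains

for multiple in (2, 3, 10):
    per_org = base_fee * multiple
    sector_wide = per_org * registered_orgs
    print(f"{multiple}x: ${per_org}/yr per domain, "
          f"${sector_wide / 1e9:.1f}B/yr across all .orgs")
# 2x: $20/yr per domain, $0.2B/yr across all .orgs
# 3x: $30/yr per domain, $0.3B/yr across all .orgs
# 10x: $100/yr per domain, $1.0B/yr across all .orgs

Trivial for any one organization at the low end; in aggregate, a sector-wide transfer from nonprofits to their new landlord.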

At the Digital Civil Society Lab, we often suggest to foundations and nonprofits that they think of their digital vendors (software, hardware, cloud, apps, mobile phones, email, etc) like landlords. You are doing your work in their space (on their servers, through their systems). The .org designation is (literally) your address.* And your rent is going up.

(This post only focuses on the financial elements of why this sale is bad for civil society. There are bigger, more philosophical and democracy-oriented reasons to care about the sale. It's an easy issue to join in coalition about - and if you're interested in why digital issues like this matter to your work - whatever that might be - read our newest report, Integrated Advocacy: Paths Forward for Digital Civil Society.)


*Others also find value in the landlord metaphor - new research here on the "Internet of Landlords"
Author: Lucy Bernholz
Posted: February 3, 2020, 12:00 pm
My next big research project (I hope) will be coordinating a distributed team of researchers from many disciplines to better understand how our digital dependencies influence our associational opportunities and rights. For example:
  1. How do the personas that data-driven algorithms create for us align with, or not, how we see ourselves and how we associate? 
  2. How does platform control of information visibility bound our associational opportunities? 
There are many other questions and I'm working on putting together both a working group of scholars and a more complete outline of the project (Feel free to contact me if you're interested).

As with all of my research, I hope to do as much of this as possible "in public": gathering, sharing, thinking, revising with interested parties. Here's a video of one recent conversation I moderated on what I'm calling (because of the lovely alliteration) Artificially Intelligent Associations. It was recorded at Stanford University's HAI (Human-Centered Artificial Intelligence) Conference on Ethics, Policy and Governance, October 29, 2019.

Featured participants are Eileen Donahoe, Terah Lyons, Sean McDonald, and Jasmine McNealy



Author: Lucy Bernholz
Posted: November 22, 2019, 12:43 am
Students often ask me for information on jobs. There's a noticeable (and welcome) uptick in interest in jobs in the public or social sectors at the intersections of policy, analysis, technology, and data. I asked Twitter for relevant job boards and here are the resources I received.*

Internet Freedom Festival Job Board

Tech Jobs for Good https://www.techjobsforgood.com/

Open Gov Jobs

Internet Law and Policy Jobs

Slack channel for jobs via SimplySecure

Giving Compass

Giving Tech Labs

 Ben Green's List - http://www.benzevgreen.com/jobs/ 

New America Public Interest says it has resources

Code For America Public Interest Job Board

Newsletter from Justice Codes often has great job announcements.


I'll update as I can. Let me know what else I missed.

*(No verification, endorsement or even claim that there are shared definitions at work here) 
Author: Lucy Bernholz
Posted: October 17, 2019, 12:58 am
I heard this story last week, from the mother of a toddler.

The kid is home, playing with Grampa. Kid is just past the peek-a-boo stage, now experimenting with hide-and-go-seek. Like most kids of this age, hiding generally involves standing on the other side of a chair or putting a piece of paper on her head. Not really hidden. But Grampa didn't get the message. When it's his turn to hide he goes in the other room. Toddler takes hands off eyes, looks around. Doesn't see Grampa. Looks a little bit worried but doesn't move. Waits another minute. Shouts out, "Alexa, where's Grampa?"

I'm going to let you sit with that.

It led us to all kinds of questions. Including about advertising on these devices. Others in the group said it's small, but growing. This article says it's already here and that we (the people) like it.  All of us agreed it seems inevitable.

Question for nonprofits: are you ready to pay whatever it will cost to make sure you are the one (and only) response when someone starts asking, "Alexa (or others), who should I donate money to?"

Question for the rest of us: You really want some engineered algorithm (no doubt based on who paid the most) telling you where to give your money?

Sigh.



Author: Lucy Bernholz
Posted: June 17, 2019, 10:48 pm
I've participated in a lot of conferences, panels, discussions etc. about "nonprofits and AI," "foundations and AI," "AI for good"* and so on. The vast majority of them miss the point altogether.

It's not really a question of these organizations using artificial intelligence, which is how every one of these panels approaches it. Most civil society organizations may be buying software that's going to use algorithmic analysis and some AI on a large dataset, perhaps through their vendors of fund development data or software. And then, yes, there are legitimate questions to be asked about the inner workings, the ethical implications, the effects on staff and board, and so on. Important questions, but hardly worth a conference panel (IMHO) - those are important software vendor considerations, and it is important for all organizations to understand how these things work, but this is not the "black magic" or "sector transforming phenomenon" that a conference organizer would want you to think.

The REAL issue is how large datasets (with all the legitimate questions raised about bias, consent and purpose) are being interrogated by proprietary algorithms (non-explainable, opaque, discriminatory) to feed decision making in the public and private sectors in ways that FUNDAMENTALLY shift how the people and communities served by nonprofits/philanthropy are being treated.
  • Biased policing algorithms cause harm that nonprofits need to understand, advocate against, deal with, and mitigate. 
  • AI driven educational programs shift the nature of learning environments and outcomes in ways that nonprofit after-school programs need to understand and (at worst) remediate, (at best) improve upon. 
  • The use of AI driven decision making to provide public benefits leaves people without clear paths of recourse to receive programs for which they qualify (read Virginia Eubanks’s Automating Inequality). 
  • Algorithmically-optimized job placement practices mean job training programs and economic development efforts need to understand how online applications are screened, as much as they help people actually add skills to their applications.
This essay on “The Automated Administrative State” is worth a read.

The real question for nonprofits and foundations is not HOW will they use AI, but how is AI being used within the domains within which they work and how must they respond?



* I try to avoid any conversations that are structured as “_____ for (social) good” and all situations that are “_[blank]_ for social good” where the [blank] is the name of a company or a specific type of technology.
Author: Lucy Bernholz
Posted: April 8, 2019, 3:31 pm
Yep, deliberately trying to provoke you with the headline. Here's what provoked me:

The news that two airplane crashes killed a total of 346 people, in part due to a software upgrade that was "optional." (read: cost more)

This story about electronic health records (software) and deaths that ensued from resultant poor medical care.

What does this have to do with philanthropy and civil society?

Philanthropic and civil society organizations are as dependent on software as are businesses and governments. Do you know how your software works? What its vulnerabilities are?

Your work may not involve the difference between life and death, but if you're collecting information on lots of people and not respecting their rights in collecting it, not protecting it once you have it, or managing it (and the software you use to hold and analyze it) in line with your mission, how much good are you really doing? Are you making the people your organization serves, or the nonprofits you fund, more vulnerable with your data practices even as you try to do good with your dollars?
Author: Lucy Bernholz
Posted: March 23, 2019, 1:43 am
The World Food Programme recently announced a partnership with Palantir. There's a lot to be concerned about here - in this specific instance and in the broadest possible sense for understanding civil society's role regarding digital data use. Please read this open letter:

https://responsibledata.io/2019/02/08/open-letter-to-wfp-re-palantir-agreement/


Does your organization know how the tech companies you partner with use the data you have on your constituents? It’s not just about “ownership,” but the insights and algorithms and predictive tools that may be built on that data. Are you “locked in” to a platform - if you wanted to switch vendors for your CRM or mailing lists or social media - can your organization get your data back?

How is your organization managing data? With any more respect for the individual people from whom it comes than these big software/analytics/insight companies? If not, why should anyone trust you with their data?

These are pivotal questions - and we need answers and processes and protocols and regulation. Civil society is not meant to be a poor cousin of big business, an outsourced arm of government, or the "data washing" arm of either.
Author: Lucy Bernholz
Posted: February 8, 2019, 7:04 pm
The tenth annual Blueprint - Blueprint 2019 - went live in December. You can find it and the entire decade's archive here.
On January 23rd we'll be hosting our annual "prediction-palooza" (free, online) discussion about philanthropy predictions. Information on that is available here.
In the meantime, I've just come off a conversation with a research group preparing a report for a big foundation on the future of philanthropy. I get asked to do a lot of these. I only agree to these conversations if there is going to be a public version of the report. I'm told that's the case - this report should be available in April.

Some thoughts as I was asked to reflect on the last 5 years and then look ahead 10 years.

Looking back: 
All the new products, platforms and ways to give (DAFs, impact investing, crowdfunding platforms, text donations, cause marketing, etc.) are not adding up to more giving by more people. As Indiana University notes, since 2000 we've lost 20 million givers - at least as recorded by data on charitable donations to tax-exempt nonprofits. This is over the same 19-year time frame that brought us online/mobile donations and online data resources about nonprofits.

-> Perhaps we can stop assuming product innovation equals growth?

Where have the "missing millions" gone? I doubt people have given up on giving; after all, we've been doing it for thousands of years. I think we have an "instrumentation" problem. Which is to say, we're measuring the wrong things. Changes in the tax code are expected to result in changes in giving to nonprofits (and in changes to how useful itemized tax forms will be for measuring giving behavior).

--> Perhaps we can ask whether we're measuring the right things?
We need new instruments to measure how Americans give and to whom. They should include measurements of person-to-person giving (e.g., as happens on crowdfunding platforms), political donations and contributions of time, investments intended to produce social/environmental outcomes, and money raised via product purchases (BOGO or cause marketing). I've been calling for this since at least 2015 - see here - and had intimations about it back in 2008 (see here).

Looking Ahead:

Commercial platforms and nonprofit data:
Does anyone really think Facebook is going to be the same thing in 2029 that it is in 2019? Not even the folks at FB would say that. Every nonprofit and philanthropist that is managing their giving, their outreach, their donors, etc. on this single platform should beware. The rules are going to change, the compan(ies) will push back, there will be all kinds of ups/downs between now and 10 years from now - but in no imaginable future is doing right by nonprofits (in terms of their data and long-term relationships with donors) in the growth plans of that company. If you can imagine either a different FB in 10 years or no FB 10 years from now, then it seems like a good idea not to put all your data eggs in a FB basket. (Or any commercial platform driven by a business model unrelated to how well it serves the social sector.)

Charity and Politics:
The legal line between these two domains is man-made. It's changed over time. It's thin and it's fragile. The current presidential administration is determined to destroy it (see the Johnson Amendment fight, as well as court cases on donor disclosure in politics). There's not enough manpower at the IRS to defend any boundary that might exist. Digital data and systems make defending the line even more difficult than it was in the analog age. Many advocates and advocacy organizations would like to see the line gone. Individual people may not care as much about separating political and charitable action as policy makers and nonprofit leaders want them to. Assuming the old boundaries between these domains function as intended is fooling oneself. We should put our attention into writing rules that protect our (often conflicting) rights (association, expression, privacy), shed light on political corruption and the excessive influence of wealth, and assume digital data collection and exposure, rather than nostalgically assuming that protecting a legal distinction established in the 1950s is the best (or only) way forward.

Shiny Objects
Is anyone but me noticing that the marketing hype about blockchain is starting to quiet down, just as digital systems that focus on security and encryption are growing in actual use? This is a good thing - stop gawking at the new packaging and let's focus on the values that particular activities require. In some cases, permanent records of transactions are a good thing (supply chain verification of objects, possibly). In other cases, distributed, immutable records may not be such a good idea (human ID, for example).

Artificial intelligence (AI), predictive algorithms, and machine learning are three more shiny objects. Most people ask me "How will nonprofits and/or philanthropists change by using AI?" I think this question has the subject and object in the wrong order. The more relevant question for most donors and social sector organizations is "How will the use of AI (etc.) change what the organizations/donors are doing?" Government bodies and commercial companies are already using these tools - they shape what you see online, what benefits you qualify for, your chances of being audited by the tax authority, your chances of getting a speeding ticket, the keywords you need to enter in your job application, etc. They are changing the nature of the problem space in which social sector organizations and philanthropists do their work. This is not the future, this is now. This is not the edge-case exception of a few organizations with some good data and data scientists, this is the great mass of organizations and donors. I'd love to see some real discussion of how philanthropy and social sector organizations can and should change to be effective in a world already being shaped by AI (etc.). Then, for dessert, we can talk about the exceptions to this rule.

It's the nature of the game that we'll chatter about the latest shiny object. What's much more interesting is how we embody shared values in new ways.



Author: Lucy Bernholz
Posted: January 2, 2019, 8:25 pm
Like many of you I woke up this morning to an email inbox full of leftover Black Friday ads, a whole bunch of Cyber Monday ads, and the Xth day in a row of #GivingTuesday announcements.

Among those was the first clearly-designed-to-misinform #GivingTuesday astroturf email that I've received.

It came from the Center for Consumer Freedom (CCF) - a nonprofit front group run by a lobbyist for a variety of industries including restaurants, alcohol, and tobacco. The umbrella group for CCF - the Center for Organizational Research and Education (CORE) - is also home to HumaneWatch. According to the 2016 990 tax filing for CORE, HumaneWatch exists to "educate the public about the Humane Society of the United States (HSUS), its misleading fundraising practices, its dismal track record of supporting pet shelters and its support of a radical animal rights agenda."

(clip from 2016 990 for CORE)

The email I received from CCF linked to a YouTube "ad." But all of it - the Consumer Freedom website, the email I received, the work of these nonprofits - leads back to a commercial PR firm, Berman and Co., which has been accused of setting up these groups as part of its paid work for industry. None of this was revealed in the email - and if you look at the website for CCF to find out who funds it, you find this statement:
"The Center for Consumer Freedom is supported by restaurants, food companies and thousands of individual consumers. From farm to fork, from urban to rural, our friends and supporters include businesses, their employees, and their customers. The Center is a nonprofit 501(c)(3) organization. We file regular statements with the Internal Revenue Service, which are open to public inspection. Many of the companies and individuals who support the Center financially have indicated that they want anonymity as contributors. They are reasonably apprehensive about privacy and safety in light of the violence and other forms of aggression some activists have adopted as a “game plan” to impose their views, so we respect their wishes."
If you check the CCF's 990 form (search under CORE) you'll find that on revenue of $4.5 million (sources undisclosed), the largest expense was $1.5 million paid to Berman and Co. for management fees. The next largest expense is $1.4 million spent on advertising and promotion.
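
A quick back-of-envelope check on those reported figures (a sketch in Python, using only the dollar amounts cited above):

# Share of CORE/CCF revenue going to its two largest expenses,
# per the 2016 990 figures cited above.
revenue = 4_500_000       # total revenue, sources undisclosed
berman_fees = 1_500_000   # management fees paid to Berman and Co.
advertising = 1_400_000   # advertising and promotion

print(f"Berman and Co. fees:   {berman_fees / revenue:.0%} of revenue")
print(f"Advertising/promotion: {advertising / revenue:.0%} of revenue")
print(f"Combined:              {(berman_fees + advertising) / revenue:.0%} of revenue")
# Berman and Co. fees:   33% of revenue
# Advertising/promotion: 31% of revenue
# Combined:              64% of revenue

Roughly two-thirds of the money goes straight to the PR firm that runs the group and to advertising.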

There's no virtue in this circle - just paid lobbyists setting up nonprofit groups to counter the messages of other nonprofit groups. On the one hand, the nonprofit sector must be doing something right when the tobacco and alcoholic beverage industries are trying to shut them up. On the other hand, good luck to you - average donor - trying to figure out what's real and what's not. Even the watchdog groups are sniping at each other.

I've written before about misinformation, the current ecosystem of distrust, and civil society. And here it is. Be careful out there.
Author: Lucy Bernholz
Posted: November 26, 2018, 6:15 pm
Three tweets from yesterday:




Depending on a commercial company for our giving infrastructure is problematic in several ways. First, at any point in time the company (and this company has done this repeatedly) can change its commitment, algorithm, or priorities and leave everyone who was using it without recourse. Second, we have no way of knowing that the company's algorithms are offering all the choices to all the people. How would you even know if your nonprofit or fundraising campaign wasn't being shown to those you were trying to reach? Third, Facebook owns this data and can tell us whatever they want about it. Maybe $1 billion was given, maybe it was more, maybe it was less - how would we know?

There's an existing infrastructure for measuring giving in the U.S. and a number of research centers that analyze and report on those trends every year. That infrastructure - from 990 tax forms to The Foundation Center, Guidestar, the Urban Institute, and independent research from Giving Institute or the Lilly School at Indiana U - was built for the purpose of public accountability, to protect the democratic values of free association and expression, and for industry-wide performance improvement. This infrastructure is not perfect. But the data they use and their analytic methods can be checked by others - they can be replicated and verified following the basic tenets of sound scientific practice and good evidence practices for policymaking.

There need to be new ways to understand what's happening on these proprietary platforms - especially if Facebook is moving $1 billion and GoFundMe $5 billion. Those are big numbers about our nonprofit sector. We need to be able to interpret these data, not just reflexively believe what the companies announce.
Author: Lucy Bernholz
Posted: November 17, 2018, 6:51 pm
I've had countless conversations with well-intentioned people from a number of social sectors and academic disciplines who are working on digital innovations that they firmly believe can be used to address shared social challenges. Some of these approaches - such as ways to use aggregated public data - are big investments in unproven hypotheses, namely that making use of these data resources will improve public service delivery.

When I ask these folks for evidence to support their hypothesis, they look at me funny. I get it, their underlying hypothesis that better use of information will lead to better outcomes seems so straightforward, why would anyone ask for evidence? In fact, this assumption is so widespread we're not only not questioning it, we're ignoring countervailing evidence.

Because there is plenty of evidence that algorithmically-driven policies and enterprise innovations are exacerbating social harms such as discrimination and inequity - from the ways social media outlets are being used to the application of predictive technologies to policing and education. Policy innovators, software coders, and data collectors need to assume that any automated tool applied to an already unjust system will exacerbate the injustices, not magically overcome these systemic problems.

We need to flip our assumptions about applying data and digital analysis to social problems. There's no excuse for continuing to act like inserting software into a broken system will fix the system, it's more likely to break it even further.

Rather than assume algorithms will produce better outcomes and hope they don't accelerate discrimination, we should assume they will be discriminatory and inequitable UNLESS designed specifically to redress these issues. This means different software code, different data sets, and simultaneous attention to structures for redress, remediation, and revision. Then, and only then, should we implement and evaluate whether the algorithmic approach can help improve whatever service area it's designed for (housing costs, educational outcomes, environmental justice, transportation access, etc.).

In other words, every innovation for public (all?) services should be designed for the real world - one in which power dynamics, prejudices, and inequities are part of the system into which the algorithms will be introduced. This assumption should inform how the software itself is written (with measures in place to check for, and remediate, biases and their amplification) as well as the structural guardrails surrounding the data and software. By this I mean implementing new organizational processes to monitor the discriminatory and harmful ways the software is working, and implementing systems for revision, remediation, and redress. If these social and organizational structures can't be built, then the technological innovation shouldn't be used - if it exacerbates inequity, it's not a social improvement.
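
To make one of those in-software "measures" concrete, here's a minimal sketch (my illustration, not an existing tool or anyone's production code) of a pre-deployment disparate-impact check, comparing outcome rates across groups against the "four-fifths rule." The group labels, threshold, and pilot data are assumptions, and passing a check like this is nowhere near sufficient on its own:

from collections import defaultdict

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs from a pilot run."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_check(decisions, threshold=0.8):
    """Flag any group whose approval rate falls below `threshold`
    times the highest group's rate (the "four-fifths rule")."""
    rates = approval_rates(decisions)
    best = max(rates.values())
    flagged = {g: r for g, r in rates.items() if r < threshold * best}
    return rates, flagged

# Illustrative pilot data: (group, approved?) outcomes from a trial run.
pilot = ([("A", True)] * 80 + [("A", False)] * 20
         + [("B", True)] * 55 + [("B", False)] * 45)

rates, flagged = disparate_impact_check(pilot)
print(rates)    # {'A': 0.8, 'B': 0.55}
print(flagged)  # {'B': 0.55} -> hold deployment; trigger review

A failed check like this should route to the organizational processes described above - review, revision, remediation, and redress - not to a quiet parameter tweak.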

Better design of our software for social problems involves factoring in the existing systemic and structural biases and directly seeking to redress them, rather than assuming that an analytic toolset on its own will produce more just outcomes. There is no "clean room" for social innovation - it takes place in the inequitable, unfair, discriminatory world of real people. No algorithm, machine learning application, or policy innovation on its own will counter that system, and it's past time to stop pretending they will. It's time to stop being sorry for or surprised by the ways our digital data-driven tools aren't improving social conditions, and start designing them in such a way that they stand a chance.
Author: Lucy Bernholz
Posted: November 16, 2018, 5:23 pm
You know the old trope about people looking on the ground around a streetlamp for their lost keys, even though they lost them down the block, simply because "that's where the light is"?

This is a little like how I've been thinking about generosity. We associate generosity - especially in the U.S. - with charitable giving to nonprofits. Everything else - volunteering time, giving to politics, direct gifts to neighbors or friends or others, mutual aid, remittances, shopping your values, investing your values - those things are all something else.

And, yes, the motivational and behavioral mix for these actions may be different. But we make a mistake when we center the one - charitable giving - and shift everything else to the edge and think that's based in human behavior. It's actually based in politics and industry.

In the U.S. we've built an infrastructure of organizations (nonprofits) that take up a lot of space in the generosity mix. And we make them register with the government, which allows us to count them. And we require them to report certain actions, which then allows us to track giving to them. Those decisions were political - and have to do with values like accountability and association and expression.

On top of those registries and reports we've built big systems and organizations to make sense of the information. Some of those institutions (Foundation Center) were built as an industry response to possible regulation. Some of those institutions (Guidestar) were built because there was a huge data set of nonprofit tax forms that existed by the 1990s. These data sets served as the "lights" that helped us "see" specific behaviors. It wasn't that other behaviors weren't happening, it's just that there weren't lights shining on them.


Shining a light on these behaviors was done to better understand this one type of generous act - it wasn't done with the intention of judging the others as lesser. But over time, all the light has focused on charitable giving to nonprofits making it seem like the other behaviors weren't happening or were less important, just because the light was not shining there.

The more the full mix of behaviors happens on digital platforms, the more lights get turned on. Where it is hard to track a gift of cash to a neighbor in need, crowdfunding platforms that facilitate such exchanges (and credit card trails) bring light onto those actions. And because more and more acts take place on digital platforms - Facebook claims to have moved $1 billion in the last year - we can now see them better. The digital trails are like new lights shining on old behaviors.

Think of it like a house of generosity. In one room are donations to charitable nonprofits. In the USA, the lights have been burning bright in this room for decades. In another room are contributions to houses of worship. Down the hall is the room of money to neighbors/friends in need. Another room is where shopping for some products and not others happens. Downstairs is investing in line with your values. There's a room for political funding and one for spending time rallying around a cause. Other rooms hold remittances or cooperative funds or mutual aid pools. As each of these behaviors shifts to use digital platforms - be it online portals, social media, texting, or even just credit card payments - it's like turning on the light in those rooms. We can "see" the behaviors better, not because they're new but because the digital trails they create are now visible - the light is shining in all those rooms.

Digital trails shine lights on lots of different behaviors. We can see things we couldn't see before. It's going to be increasingly important that we have public access to data on what's going on in the whole house, not just certain rooms. Right now, the data on many of these behaviors is held in closed fashion by the platforms on which the transactions happen - crowdfunding platforms know what happens on them, Facebook tells us what happens there, and so on. We're dependent on the holder of the light to shine it into certain rooms. This isn't in the public's interest. Having the lights turned on is better than being in the dark, but having public access to the light switches is what really matters.

Author: Lucy Bernholz
Posted: November 14, 2018, 8:22 pm
I've been talking to a lot of nonprofit and foundation folks + software developers lately. The good news is these two communities are starting to work together - from the beginning. But there is a long way to go. Just because you're working in or with a nonprofit/social sector/civil society organization doesn't mean unleashing the most sophisticated software/data analytic techniques is a good thing. In fact, using cutting edge algorithmic or analytic techniques that haven't been tried before in an effort to help already vulnerable people is quite possibly a really bad idea.

I've come to believe that the first question that these teams of well meaning people should ask about whatever it is they're about to build is:
"How will this thing be used against its intended purpose?"
How will it be broken, hacked, manipulated, used to derail the good intention it was designed for? If the software is being designed to keep some people safe, how will those trying to do harm use it? If it's intended to protect privacy, how will it be used to expose or train attention in another dangerous way?

Think about it this way - every vulnerable community is vulnerable because some other set of communities and structures is making them that way. Your software probably doesn't (can't) address those oppressive or exploitative actors' motives or resources. So when you deploy it, it will be used in the continuing context of intentional or secondary harms.

If you can't figure out the ecosystem of safety belts and air bags, traffic rules, insurance companies, drivers' education, and regulatory systems that need to help make sure that whatever you build does more help than harm, ask yourself - are we ready for this? Because things will go wrong. And the best tool in the wrong hands makes things worse, not better.
Author: Lucy Bernholz
Posted: October 21, 2018, 5:02 pm
A lot of work on responsible data practices in nonprofits has focused on staff skills to manage digital resources. This is great. Progress is being made.

Digital resources (data and infrastructure) are core parts of organizational capacity. We need to help board members understand and govern these resources in line with mission and in safe, ethical and responsible ways.

Digital data and infrastructure need to become part of the regular purview of boards in thinking about liabilities and line items.
  • Ongoing budgeting for staff (and board) training on responsible data governance 
  • Making sure practices are in place - and insurance purchased when practices fail - to protect the people the organization serves when something goes wrong 
  • Understanding the security and privacy implications of communicating digitally with volunteer board members
  • Horizon scanning on ethical digital practice and opportunities
Digital data governance is as much a part of running an effective organization as are financial controls and good human resource practices. We need to help board members lead.
Author: Lucy Bernholz
Posted: October 12, 2018, 6:13 pm