UAlbany News Podcast

Automating Inequality, with Virginia Eubanks

Episode Summary

Virginia Eubanks is an associate professor of political science at UAlbany's Rockefeller College of Public Affairs & Policy. On this episode of the UAlbany News Podcast, Eubanks discusses her book, 'Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor.'

Episode Notes

Virginia Eubanks is an associate professor of political science at UAlbany's Rockefeller College of Public Affairs & Policy. On this episode of the UAlbany News Podcast, Eubanks discusses her book, 'Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor.'

In the book, she details three examples of technology failing to streamline welfare programs:

• an effort to automate eligibility processes for public assistance programs in Indiana
• an electronic registry of the homeless in California
• a statistical model in Pennsylvania that attempts to predict child maltreatment

These automated public service systems are designed to serve some of the country’s most vulnerable populations, such as those living in poverty or contending with poor health, while at the same time saving the government time and money. But these technologies can leave poor families feeling tracked, targeted and trapped.

Eubanks explains how these systems fail to remove human bias, exacerbate inequality and perpetuate a "Digital Poorhouse" for working-class people in America.

The UAlbany News Podcast is hosted and produced by Sarah O'Carroll, a Communications Specialist at the University at Albany, State University of New York, with production assistance by Patrick Dodson and Scott Freedman.

Have a comment or question about one of our episodes? You can email us at mediarelations@albany.edu, and you can find us on Twitter @UAlbanyNews.

Episode Transcription

Sarah O'Carroll:
Welcome to the UAlbany News Podcast. I'm your host Sarah O'Carroll.

Sarah O'Carroll:
Our guest this morning is Virginia Eubanks, an Associate Professor of Political Science in UAlbany's Rockefeller College. She is here with me today to talk about her book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Virginia, thank you so much for being here.

Virginia E.:
Thanks so much for having me.

Sarah O'Carroll:
It's been quite a year since your book was published. It seems like you've been all over the map.

Virginia E.:
Yeah, it's been a really exciting year. A lot of great conversations with people, yeah, all over the country and all over the world about some of these really important issues that I was able to report on in Automating Inequality. I've been really excited and really humbled and honored to be part of this larger conversation.

Sarah O'Carroll:
Very cool. Now, in Automating Inequality, you argue that we are building a digital poorhouse within our public service programs. What do you mean by this term "digital poorhouse," just so that anyone listening can be on the same page?

Virginia E.:
Yeah. When we talk about the newest, sexiest, sparkliest digital tools in public assistance, we have a tendency to talk about them as if they're profoundly disruptive, but one of the things I found in my research and reporting is that many of them are much more evolution than revolution. In order to understand the way that the tools were being developed and how they were being implemented, it was really important to understand the history of our poverty policy in the United States. I go all the way back to 1819, which is a moment where we as a political community decided that we wanted to raise the barriers to receiving public aid so high that no one but the most desperate people would ever ask for help. What we did at that time was invent this thing called the County Poorhouse.

Virginia E.:
The poorhouse was a brick-and-mortar institution for basically incarcerating anyone who asked for help. If you had these rights at the time, you had to give up a number of them to enter the poorhouse: your right to vote or hold office, your ability to marry. You couldn't marry if you were in the poorhouse. You were often separated from your children, because it was believed that poor children could be redeemed by having more access to richer families, and by access they tended to mean doing agricultural or domestic labor.

Virginia E.:
The most famous of these institutions, like the Tewksbury Poorhouse in Tewksbury, Massachusetts, had death rates as high as 30% annually, so a third of the people who entered them died every year. You were really taking your life in your hands when you entered the poorhouse. The reason I ground the history I tell about our new digital tools, about predictive analytics and big data, in that moment is that I believe it provides the deep legacy programming of these new systems. It's the moment where we say that the first and most important thing our public service programs can do is a kind of moral diagnosis, deciding who really deserves public assistance and who does not. That's what we see coming up again and again in the new tools that I talk about in the rest of the book.

Sarah O'Carroll:
Now, you looked at three examples where technology failed to streamline these welfare programs. Why did you choose these three and their respective locations?

Virginia E.:
Yeah, I talk about three stories. The first is an attempt to automate all of the eligibility processes for the state of Indiana's welfare programs: that's cash assistance, or TANF; what used to be called food stamps and is now called SNAP; and also Medicaid, or medical insurance for poor and working families. The second story I talk about is a tool called coordinated entry, or the coordinated entry system, which is actually in use across the country and increasingly around the world. Its proponents call it the match.com of homeless services. It's supposed to match the most vulnerable unhoused people with the most appropriate available resources. I studied that in Los Angeles County.

Virginia E.:
The last technology I look at is a statistical model in Allegheny County, Pennsylvania, called the Allegheny Family Screening Tool, which is supposed to be able to predict which children might be victims of abuse or neglect sometime in the future. I thought a lot about how to choose the cases I was going to look at in the book. Indiana really is the first case that came to my attention, and it's extensively documented because there was a very high-profile court case when the state eventually broke the contract it had with IBM to build this tool.

Sarah O'Carroll:
Which court case?

Virginia E.:
Indiana versus IBM is what it was called. Because there were such good records, it made a really interesting story to follow. Once I decided that I wanted to work with the Indiana story, what I wanted to do was push people to think a little bit about cases where the intentions of the designers were really, really good, where they were really smart people, and where they were really committed to some of the basic things that progressive critics of algorithmic decision making ask for. When we talk about the sorts of issues that arise around these automated decision-making systems, we ask for transparency, an ability to see what's in the tool; accountability, that it's held in public somewhere so that we have some kind of democratic control over it; and sometimes participatory design, a role for the people who are going to be using the tool in designing it.

Virginia E.:
In Los Angeles and Allegheny County, the designers of the tools actually did all three of those things, but they still managed to produce technologies that, from the point of view of their targets, poor and working-class families, feel really dangerous. I really wanted us to push on our easy solutions and think about what it says about public assistance in the United States that we're producing these tools that increasingly risk-rate and do a kind of moral diagnosis of poor and working people.

Sarah O'Carroll:
If these groups are acting in their best light, following those principles, and something is still missing, then there has to be a greater issue here.

Virginia E.:
Yeah, I think that's true. I think one of the things that sometimes happens when I talk about the book is people will kind of roll their eyes at me and be like, "Oh, you just picked the worst cases you could find just to write a really scary book so everybody would buy it." I tend to say unfortunately that's not at all true, and that the latter two cases I study in the book are really some of the best technologies we have, not some of the worst, and they still leave us asking some pretty important questions about how we're dealing, or not dealing, with poverty and economic inequality in the US.

Sarah O'Carroll:
Now, what were some of the questions that you had or what were some of the biggest takeaways from each of these locations?

Virginia E.:
Yeah, that's a great question. The Indiana case resulted in about a million benefits denials in the first three years of the experiment, which was a 54% increase over the three years before the experiment. This is a really significant burden on poor and working families in Indiana. I think there are a couple of really important issues that the story raises. The first is about the relationship between caseworkers and the people they serve. Part of this system was a move from what's thought of as case-based casework, which means you're responsible for a caseload, a group of families who you often develop a relationship with and follow over time, to what's known as a task-based system, where rather than having relationships with families, you as a caseworker move to a regionalized call center and just respond to a list of tasks as they drop into your queue in a workflow management system.

Virginia E.:
This felt like a really big change in the way that we provide public assistance in the United States. From the point of view of workers, it felt like they never had any context on what was happening in people's lives. Where in the past they could say, "Oh, it looks like you're not going to be eligible for food stamps," they could also say, "But there's a food pantry in your town and it's open Tuesday nights." That was something they couldn't do anymore. From the point of view of applicants or recipients, a lot of these application processes are actually really complicated and very lengthy, so mistakes happen. It felt to them like if a mistake was made, because they didn't have a relationship with a caseworker, it was really up to them to figure out what had gone wrong and to fix it. That's an extraordinary burden to put on the shoulders of some of the most vulnerable families in the state.

Sarah O'Carroll:
Well sure.

Virginia E.:
That really feels like one of the most important issues that the Indiana case raises. In Los Angeles, the coordinated entry system is increasingly popular across the country and around the world. Paying attention to it in Los Angeles was particularly interesting just because of the real severity of the housing crisis in Los Angeles County. There are 58,000 unhoused people in Los Angeles County. I live in Troy, New York, a small city in upstate New York, and that's my entire city, plus 8,000 more people, homeless in Los Angeles. The severity of this human rights crisis is really, really obvious anywhere you go in LA. There are encampments in just about every neighborhood in LA now. I absolutely understand the impulse to try to find a way to prioritize people, to try to find a way to get the help that's needed to the people who need it most.

Virginia E.:
Unfortunately, though, I sometimes fear that these systems, at their worst, act as a kind of empathy override. I fear sometimes that this idea of matching need to resource allows us to escape the bigger issue, which is that we simply don't have enough housing, or we're not making enough housing available, to really deal with the housing crisis in the United States, to really deal with the incredible volume of unhoused people who are living on streets and under bridges and in cars around the country.

Virginia E.:
Finally, Allegheny County is the most complicated story I tell, both technically and ethically, I think. This is a tool that is supposed to be able to predict people's actions in the future, and I think that raises really profound questions for democracy. I think that having more reliable data when you're making really difficult decisions, like whose family is investigated for maltreatment, is an important and valuable thing. But I can say that, from the point of view of families who are interacting with the system, they feel like they are being dehumanized, that their complex lives are being flattened to a handful of data points-

Sarah O'Carroll:
Just numbers and not humans behind it.

Virginia E.:
Yeah, and specifically that it makes guesses about what they're going to do in the future. To the families I spoke to, this really feels like an ethical breach, because probabilities never actually describe the action of any one person. They feel very much like the way the system is set up confuses parenting while poor with poor parenting, surveils their communities, and sees them as risky to their families for things that are largely out of their control. They're really concerned about the way that this risk score, this red number that they might get, will follow them and even follow their children or their grandchildren as they go about forming their own families and living their own lives. I think that raises some really important and really vexing questions about how we want government to work and what we think families need to be safe and healthy. That, I think, is one of the main issues that gets raised in Allegheny County.

Sarah O'Carroll:
I want to ask what's next, or how legislators and policymakers can take your research and improve these systems. Because what we're using now is not doing justice; it's not helping as much as it could.

Virginia E.:
Yeah. I think the thing that really makes the book different... There have been really a dozen or more great books about the hazards of algorithmic decision making that came out in the last year or so. There's just really fantastic work, and I'm really excited to be part of that larger conversation. But I think one of the things that Automating Inequality does a little differently is that I did 105 interviews for this book. I talked to designers, administrators, academic economists, policymakers, and frontline caseworkers, but every time I started a story, I started from the point of view of people who see themselves as targets of these systems: people who are trying to get public assistance, unhoused folks, either still living outside or who have found housing, and families who have been interacting with the child welfare system. I think that is the reason I was able to tell stories in different ways than other people have been able to around these topics.

Virginia E.:
I think the lesson that offers is that you don't actually need to know the sophisticated workings of machine learning and artificial intelligence in order to make good decisions about how we govern social programs. I think that one of the real hazards of these tools is that they operate under a kind of obscurantism, right? They make the workings of the program obscure and difficult to understand. I think that's a profoundly anti-democratic idea and force. I think the single most important thing that policymakers could do is rethink who the experts are around these systems when we're making decisions about them. That means not just the folks who know how to write a machine learning algorithm, but also the people who know the most about the policy area itself.

Virginia E.:
If you're in child welfare, it should be caseworkers, it should be administrators, it should be nonprofits, but it should also be parents and children. Too often, the folks who are most impacted by these tools are not consulted at the least, let alone, in a larger sense, really involved in the processes of decision making around how we're going to use these tools. I think the primary problem we have is that we tend to see these digital tools as just administrative upgrades. The reality is that they're political decision-making machines. If we're allowing them to make political decisions for us, we have to be involved in how they're designed, how they're implemented, and how we're talking about them. I think that kind of deep involvement in these processes is the most important thing that we can do to reach more just outcomes.

Sarah O'Carroll:
The last time we communicated, you shared that you were starting a new book project about the American Poor People's Campaign, the movement that Martin Luther King Jr. started. Where are you in this project and what's ahead?

Virginia E.:
Here's what I say now when I get asked this question. I'm working on a number of different things, but the joke I usually tell is that I've decided to date before I marry my next project. I realized at the end of this book that it takes me seven or eight years to write a book, and I'm nearing 50, so I only have so many seven- or eight-year chunks of my life left. I want to be really sure that I'm committed to a project before I marry it. I'm working on a number of investigative long-form nonfiction pieces over the next year that will feel like punctuation at the end of this book, things that I didn't get a chance to talk about, or things that have developed since the book came out, that feel important to be part of the conversation.

Virginia E.:
I'm not sure what will be next for me. I am really excited about the poor people's movement in the US, though I'll say I ground that in the National Welfare Rights Movement, and not necessarily in the wonderful work around the Poor People's Campaign done by the Southern Christian Leadership Conference and Martin Luther King. And specifically, there is an amazing movement in this country called the Poor People's Economic Human Rights Campaign that is doing some really fantastic work to help people understand poor as a political identity and to develop new models of leadership so that poor and working people can be in control and leadership of their own movements.

Virginia E.:
I'm really excited about that work, and I think it's something that many people don't see or know very much about. I'd love to do that work, but I'm not committing to it at this point. There are too many things to do and not enough time, which is a great problem to have.

Sarah O'Carroll:
For sure. This was your third interview of the day and you've got a three hour class later this evening. Virginia, thank you so much for making this part of your day.

Virginia E.:
Oh, thank you so much for having me. It was a great conversation.

Sarah O'Carroll:
Thank you for listening to the UAlbany News Podcast. I'm your host, Sarah O'Carroll, and that was Virginia Eubanks, an Associate Professor of Political Science in UAlbany's Rockefeller College. If you're interested in the political science perspective, you might want to check out a recent podcast I did with Timothy Weaver of Rockefeller College. He spoke with me about his work analyzing opportunity zones. You can let us know what you thought of this episode by emailing us at mediarelations@albany.edu. You can find us on Twitter @UAlbanyNews.