Disability Culture Lab on AI, Policy, and Creative Power

Meier Galblum Haigh and Kenrya Rankin


AI is often framed as either a miracle tool or a looming threat, but what does it actually mean for disabled artists navigating grants, fellowships, and creative partnerships? In this episode, Meier Galblum Haigh and Kenrya Rankin from Disability Culture Lab take over the GIA podcast to unpack the realities of AI, from access tools to authorship, labor, and institutional policy. Listen for an examination of how justice-minded organizations can channel conversations about AI that center disabled artists' autonomy, creativity, and care.


Connect with Disability Culture Lab

Website | LinkedIn | Instagram | Facebook | Bluesky

Resources + Links


Jamie Sharp:

Hello. My name is Jamie Sharp and I'm the senior program manager at Grantmakers in the Arts. Thank you for tuning in for another episode of the GIA Podcast. AI is often framed as either a miracle tool or a looming threat, but what does it actually mean for disabled artists navigating grants, fellowships, and creative partnerships? In this episode, Meier Galblum Haigh and Kenrya Rankin from Disability Culture Lab take over the GIA podcast to unpack the realities of AI, from access tools to authorship, labor, and institutional policy. Listen for an examination of how justice-minded organizations can channel conversations about AI that center disabled artists' autonomy, creativity, and care. I hope you enjoy.

Meier Galblum Haigh:

Hello. Well, first off, hello and welcome. Hello to our audience. This is a Disability Culture Lab guest takeover of the Grantmakers in the Arts Podcast, and we are so happy to be here. So we're going to start off with some introductions since we are both new here.

Kenrya Rankin:

We are. So I am Kenrya Rankin. My pronouns are she and her. I am a dark-skinned, disabled Black woman with chin length locks, rose gold granny glasses that are on a chain and really bright pink lipstick because it makes me happy. I serve as managing director at Disability Culture Lab. I'm originally from Cleveland, Ohio, but live in DC with my kid. And my whole thing is creating dynamic, high impact content that amplifies the lived experiences, advocacy, art, labor, all the things of disabled people of color to help shift the narrative around who deserves liberation, joy, justice, dignity, all the big things here in this country. And I am really excited because I get to do that every day via our work with DCL.

Meier Galblum Haigh:

Ooh, love that. Welcome, welcome. And I am Meier Galblum Haigh. My pronouns are they and them. I'm the founding executive director of the Disability Culture Lab. I am a non-binary transmasc disabled human, and here's a quick visual description of me, since none of y'all can see me out there: I've got short, dark, curly hair, very Ashkenazi Jewish coded. I'm white and I've got glasses as well. Reading glasses. And I'm often wearing Disability Culture Lab streetwear. Today I'm wearing a rainbow cure ableism sweatshirt. I live life on wheels, so you'll usually see me on my mobility scooter. Although if you see me on Zoom, I'm often laying down. We're an all disabled team here at Disability Culture Lab.

Kenrya Rankin:

Yeah.

Meier Galblum Haigh:

So you'd see both of us laying down actually on this lovely podcast if you had video. I was born and raised in Fairbanks, Alaska, and I've been organizing at the intersection of social movements and media for the last 20 years or so. And now I'm organizing from my home in the District of Columbia where I live with my spouse and my very cute six-year-old who's home today on a snow day because it has been quite the storm here.

Kenrya Rankin:

It has indeed. And it's funny because that reminds me of how different it is when your kids are different ages because my kid is 14. They have no inclination to go out into the snow today.

Meier Galblum Haigh:

And mine has been sledding up a storm.

Kenrya Rankin:

Yes. I love that. Well, while yours is sledding and mine is ignoring me because of the snow, we are going to have a conversation today. But before we really jump into the interview of it all, do you want to tell people a bit about what Disability Culture Lab is and why we exist?

Meier Galblum Haigh:

Absolutely. So I'll start off with that. And then I'm going to interview you, which I'm very excited about today. But the Disability Culture Lab is a nonprofit disability media and narrative lab. And our mission here is to shift the narrative on disability from fear and pity to solidarity and liberation. And we build strategic communications infrastructure by and for the disability community. And if you're wondering what is strategic communications, if you're listening here, and what does that have to do with what I do? We see everything that shifts culture as included in strategic communications. So that's everything from PR to digital communications and the arts. And we ground everything that we do here at Disability Culture Lab in disability justice. And we're working to help people understand disability differently, not just as an accident or something that happens to folks on a whim, but as a policy choice, because every form of oppression from racism to colonialism to capitalism is disabling, and disability is the end of every oppression story. So we're really working to change the way people think about disability every single day.

And today, of course, we are talking about AI and the arts, because AI itself has stirred up a huge conversation in our community right now with the way that tech companies have moved incredibly quickly to shift the ground under the feet on which we are walking or rolling. And there's been a lot of conversation right now about disability and capitalism and inequality in our communities. And it's a topic that you, Kenrya, have dedicated the last few years of your career to at Mozilla Foundation, and then here with us at Disability Culture Lab. So I was wondering if maybe you could start by telling us a bit about what got you interested in AI in the first place.

Kenrya Rankin:

I can indeed. And it basically requires me going way, way back because I'm super old. So growing up, I was a nerd, but not just one kind of nerd, I was all the kinds of nerds. I was an academic nerd who did well in school. I was a band nerd. I played clarinet and I was in a marching band. I was an orchestra nerd. I was first chair violin. I was also a sci-fi nerd. If there was a bizarre novel or a movie that I could get my hands on, I was going to watch it and I was probably going to watch it a whole bunch of times. It's to the point now where my kid is like, so you just know all the words to every movie, and the answer is yes. But as a very grown adult, that hasn't really changed much. I've just found other ways to nerd out and those include reading manga. My kid and I like to go to late night anime screenings and we go to cons and we cosplay and we teach ourselves coding languages. I'm learning Python and she's been learning Scratch for years. It's a big part of who I am. And so that nerding out has always been a part of me. And I think it directly connects to how I nerd out around AI.

But it's wild because we work at a comms shop. You just talked about what we do. We didn't say anything about AI with all of that. But I feel like the universe has this way of bringing us back to ourselves no matter how we get away from ourselves. And over the years, I've just found myself being pulled willingly, but pulled back into STEAM spaces, first with writing about emerging technology for various magazines and websites, and then producing ed tech content for nonprofits. And then as you mentioned, working at Mozilla Foundation, where I was a research editor and my whole thing there was examining the real life impacts of AI on real life people. So taking it out of the abstract and out of the lab and really thinking about how AI relates to things like racial, gender and disability justice, and then crafting movement building infrastructure that breaks silos and facilitates cross-movement organizing. Because as you mentioned earlier, if all forms of oppression are disabling, how do we break ourselves out of that oppression if we don't work together to do that?

And so that work really opened my eyes to the ways that algorithms, because that's all AI is, and the large language models that they power impact literally every part of our lives: from the jobs that we're able to apply for and secure, to the prescriptions that our insurance plans will cover, because they use AI algorithms to decide whether or not they will cover something, to where we're able to get loans to buy homes, to the art that we make and the art that we consume, which is what we're talking about today. And really once my eyes were open, there wasn't any going back. And that's why here at Disability Culture Lab, I'm so excited that I have the privilege to lead our AI work.

I think the thing that keeps me interested in it these days is the potential. AI has the potential to expand access in really cool ways, but it also has the potential to steal intellectual property. It has the potential to help disabled creators deepen their arts practice, and it also has the potential to lock folks out of being able to protect their intellectual property from being trained or from having systems trained on it, but also from other folks just jacking their style. And the reality is for better or for worse, the ways that AI is deployed at scale, it's not just a technical issue, it's a human issue. And that's why it's so important to the work that we do. And we know that human challenges require human solutions, which may or may not include technology, but at the end of the day, what I'm really interested in is how we can apply the potential of AI to solving human challenges that make life better for all of us, especially disabled folks.

Meier Galblum Haigh:

I love that. Okay. So I feel like there are going to be some people listening to this podcast who maybe don't understand the connection between disability and AI. And I want to ask you how all of this connects to the arts, but before we do that, can you help just explain what is the connection between disability and AI, and then how does all of that connect to the arts?

Kenrya Rankin:

Yeah, of course. So just like folks are talking about it in all these other spaces, AI is a really hot topic in the disability community right now, and especially when we look at the intersection of disability and the arts community. We here at DCL, we're in the midst of doing multiple, what we call North Star visioning processes, just kind of our bread and butter. It's a thing that we offer to community groups, to entire movements, to organizations that are either trying to start up or are trying to find their way to help them figure out what winning looks like on their issues. And so we've done several processes over the last year, and we've brought together dozens of artists, creators, organizers, leaders across the disability community, and also leaders across other social movements, because again, we are all about breaking silos. And so we've been talking about what winning and liberation look like when it comes to AI and labor and disabled people.

And of course, that means AI has come up a lot in those conversations. We've been seeing it. We had about 400 people apply to our brand new fellowship, Disability Rising, which we launched at the end of last year. And we did not use AI to evaluate those applications, but we saw it very clearly used in several folks' applications. We've seen it in concepts that artists have submitted for things that they wanted us to make as part of our streetwear brand. And actually it was those concepts that we received from artists that really served as a first sign to us that we need to create an internal policy about how we engage with AI-facilitated work when it comes to working with disabled artists. It wasn't a dot that we had necessarily connected until we had to. And I think it forced us to think about what feels both accessible and innovative while still honoring the work and expertise that goes into creation at that level.

And so over our first two years, we've seen disabled artists who really want to use AI to generate ideas for new projects, but then we've got folks who don't want to use AI for anything. And you have folks who strike a compromise where they're like, I don't want to use it to create art, but it could be useful for more mundane stuff like inventorying my art supplies, creating shopping lists for art projects, things that feel like they're more rote work as opposed to the creative side of being an artist. We've even seen folks who are ... and this is a sentiment that I share and we'll talk about a bit later ... who are very upset about the ways that these AI companies are disabling our communities via the acceleration of fossil fuel extraction. So it's an area that does not come without a lot of nuance. And so I love that because I love when things are not just stark black and white and really force us to not only think about what we want, but to really talk to our community about what it needs.

And so yeah. There's actually a really good example of that. So when we were kicking off the work that sits at the intersection of disability, labor and AI, we had folks who joined the meeting, and these are people who all do work in that space and various parts of it. And several folks logged on using AI note-takers, which show up in Zoom as just their own separate person essentially. And we had to negotiate access friction in real time when there were folks who were in the meeting who were like, I do not feel comfortable having this system in this meeting. I don't know what information it's collecting. I don't know how it's going to use it. I don't have any instruction about how to prevent it from using my information. And so they asked that we take those AI note-takers out of the meeting. And it was so interesting to be in real time figuring out how do you meet the access needs of some folks who rely on those note-takers in order to be engaged in the meetings and to be able to follow up after with the privacy concerns of other folks who are in the meeting who know a whole lot about AI too and are very worried about the ways that it will impact their privacy.

And so it's just been fascinating, honestly, to figure out how do we navigate this space while meeting the needs of everyone? And I don't know that we know that yet. But I do think it's very clear that this technology is moving really fast and that not even just at DCL, but worldwide, policies and organizing have not been able to keep up.

Meier Galblum Haigh:

Yes. I think that's exactly right. And the philanthropic money has not flowed as fast as the tech money has flowed. The money for organizing has just started being announced this year. AI has moved so fast in the last five years, but we only saw these big funds for organizing around AI announced in maybe the last six months. And before that, almost all the organizing that we've seen around AI, trying to figure out how social movements respond, let alone how arts organizations respond (and I don't think we've seen anything there), has been entirely off the side of people's desks, which I think is really interesting. And one side note for our audience, just in case folks don't know the term access friction. Access conflict is when you have a one-off situation where you've got two people in a room who might have access needs that conflict with one another. Access friction is when you've got an ongoing community situation where two people's access needs are different from one another, and you need to come up with a consensus decision together about how to exist in community in an ongoing way, despite the fact that you've got different access needs that might be in ongoing friction with one another.

And in the disability community, if you're grounding yourself in disability justice, that's often something that we try to do through consensus and mediation. And so the divide and conquer approach of an able-bodied lens would be to say, "There's nothing we can do." But in the disability community, when we see access friction, we often go into problem solving mode and try to say, "Okay. How can we all win here? What's the best way? How can we look at this from another angle and have everyone's needs met?" And AI is a new place where we're seeing this access friction pop up, where we're having to try to look at things from new angles and find new ways to approach the problem.

Kenrya Rankin:

Love that explanation. Yeah. It's a lot. It is not an easy thing to do in real time.

Meier Galblum Haigh:

Yeah. All sorts of new places. And we're seeing that same access friction, like you were saying, even just around our streetwear brand where we've got artists who are saying, "Here's this accommodation I need." And folks in their community saying, "I don't want to consume art in this way." And it's like a values friction almost.

Kenrya Rankin:

Yes. I do fundamentally think that's right.

Meier Galblum Haigh:

It's like an access and values friction that we are trying to figure out in our own policies. Yeah.

Kenrya Rankin:

Yeah. I'm wondering, would you want to zoom out a bit for us and talk a bit about the ways that we are engaging with artists big picture?

Meier Galblum Haigh:

Yes. This is a great spot to do that. And then we can zoom back in a bit more on our policies just so folks know what we're talking about when we're talking about these different artists. So Disability Culture Lab has a few different ways that we engage with artists. One of the things that we have, we've mentioned it a couple places here, is we have a streetwear brand. And so if you want a visual, you can see it on our website at disabilityculturelab.org or shop.disabilityculturelab.org. But we engage with disabled artists and all of our art is focused on narrative change around disability. We believe that bad stories lead to bad policy. And so we're trying to shift the kinds of stories that are told around disability. And we try to engage with disabled artists, primarily multi-marginalized disabled artists, to help imagine liberation, especially through our North Star visioning sessions, where different narrative-changing ideas come forward, and then to visualize those ideas in art. And then we put them on our streetwear brand. So we put them on T-shirts and we put them on stickers and we put them on sweatshirts and occasionally on hats and other things.

And then we put them out into the world and we do partnerships with influencers. And we also do performance art, especially queer and disabled performance art. And that means drag and burlesque and music and comedy. And we try to integrate our streetwear into those in person performance shows as well so that the elements of our narrative shifting visual art are also present in our performance art. And so we integrate that into our in person shows. And then we also do this big picture visioning, which we try to integrate into our written work that we put out into the world. We see that as part of our arts work. And then through our fellowships, we also have arts fellows. So Disability Rising is a fellowship program that we have for rising multi-marginalized leaders. And some of our fellows work at the intersection of disability and the arts. And we're working on raising the brands and profiles of 10 multi-marginalized disabled leaders each year at the intersection of disability and other social justice issues or areas. And some of those include the arts.

So arts stretches across lots of what we do. And of course, we also have projects that we're looking to get off the ground, which includes more in person art work for visual arts, helping tell disabled histories and futures in cities across the country, especially coming into the 250th anniversary of the Declaration of Independence to see what's being erased. So arts integrates into all that we do at Disability Culture Lab, but those are some of the places that we have it. And to be frank, AI has come across all of these areas already. It's come up, as we've already talked about, in how artists are turning in work to us and how they want to be engaging in the art that they create. It's come up in the applications that artists have put in for funding or fellowships. It's come up in the suggestions for how we engage in access around arts work, from our in person art work to the ways that we're engaging artists in those North Star visioning sessions, because we try to engage across a range of the disability community in all of our visioning work. So that includes policy members and artists. And pretty much everywhere else.

There's nowhere that AI isn't coming up right now, as I'm sure anyone can relate to right now. It's like you can't stub your toe without somebody suggesting that you use AI, whether it's your search engine or your neighbor.

Kenrya Rankin:

Yeah. It's surreal. I literally just changed the browser that I'm using on my phone because I figured out how to do it so that I am using something that does not default to AI responses for everything, which makes me really happy.

Meier Galblum Haigh:

I love that for you. I love that for you, but it's very difficult. So you are creating, Kenrya, Disability Culture Lab's AI policy for artist collaborations and fellowships, which is ... We hadn't even ... When we started, we just assumed everyone was on the same page, which was such ... Looking back on it, what a ridiculous assumption.

Kenrya Rankin:

Yeah. The bravado.

Meier Galblum Haigh:

The bravado. But when we started just hiring artists to do, for instance, even our streetwear collaborations or putting out calls for artists or bringing in applications for funding, we just assumed everyone in our community was thinking about this the same way. Completely wrong assumption, obviously. How are you thinking about our AI policy now that we have realized that we are totally wrong on that assumption?

Kenrya Rankin:

I think for me, it really comes down to three priorities, protecting intellectual property, expanding access, and avoiding disabling impacts. So a couple of months ago, I got a letter via snail mail saying that one of my books ... This one's called How We Fight White Supremacy, is among hundreds of thousands of titles that were illegally used to train OpenAI's ChatGPT. Yay.

Meier Galblum Haigh:

Rude.

Kenrya Rankin:

Yeah.

Meier Galblum Haigh:

So rude.

Kenrya Rankin:

And so because of that, I now qualify to join a class action lawsuit against that company just as a remedy for unauthorized use of my work.

Meier Galblum Haigh:

To hopefully get paid for the fact that people can now use ChatGPT, which was trained on your writing, for their own work.

Kenrya Rankin:

Yeah. It would be nice. And quite frankly, that's a fear that we heard from several disabled artists throughout the North Star visioning process. This fear that their work, their words, their art would be used without their permission to train these systems without any type of compensation, without any type of transparency into how it's being used. And at its core, this fear is one that really matches one for one for a lot of what it feels like to be in disability community. It's like this fear that you'll be erased. That if you are not outside of your bed, out in society, doing the thing in front of people, that folks will forget that you exist.

Meier Galblum Haigh:

I think that's so scary right now, especially in the midst of the largest mass disabling moment since the Vietnam War, which is long COVID. Obviously we've got multiple mass disabling events now with ICE and state violence. But COVID has sent so many young people in particular on this track of being trapped at home. And certainly postviral illness isn't new. But one thing I think we heard a lot of in the last two years that I think has really shocked me is how many people in their 20s and 30s are just trapped at home and aren't leaving. And even when we were getting applications for a really entry level position, I think something like a third of our applications were from people who couldn't take work that wasn't remote, people in their early 20s applying for an entry level position who were completely homebound.

Kenrya Rankin:

Build a career.

Meier Galblum Haigh:

How do you build a career? I've spent a lot of my late 20s and early 30s and now, and much of my 30s at this point, homebound. But in my early 20s at least was a little bit more mobile and worked a lot of 80-hour weeks, which is probably part of why I'm as disabled as I am now.

Kenrya Rankin:

That part.

Meier Galblum Haigh:

And a lot of 100-hour weeks, let's be honest. But this fear of being forgotten and not having a history is something I think we heard so much of in our North Star visioning sessions. It was like, okay, well, if no one sees me, is there an idea I can be remembered by? How do I leave a mark if I don't leave my house? And I think it's something we hear a lot of in disability community; I don't think it's unique to our community, but I think it's more amplified.

Kenrya Rankin:

No. I think you're absolutely right. And I think it's also a really great example of how the big tech companies that sit behind these models that people are pushing at us all the time quite frankly ignore the existing laws that are already on the books that should be protecting artists from things like this, like copyright. That's a really basic foundational law in this country that is being completely ignored with the intention of boosting these technological products as opposed to supporting the people whose work is actually fueling the creation of these products. And that's one of the things I worry about as I'm creating our policy. How do we protect the intellectual property of artists, disabled and otherwise marginalized, from being used without their authorization and without their compensation? I've seen these systems scrape artists' entire catalogs and then regurgitate some bootleg facsimile of it that somebody's going to just end up texting to their homegirl or using on a party invitation or using for a Zoom background. And I'm just like, yo, you used all those resources to create that, and how does that impact the person who originally created it?

And the reality is, I don't have any interest in the partners that we work with using AI in ways that would undermine other artists' work. There's always that push and that pull in these conversations. But I will say that I'd also be remiss if I didn't acknowledge the fact that these AI systems, if they are actually built by and for the people who are closest to the harm, could potentially open up a new world of access for disabled artists and just the disability community at large. But it has to be done the right way, which is where policies like this come in.

You've got artists like Sean ... I think it's Aaberg, and our apologies if I'm not spelling it right, but he's an illustrator, and he uses generative AI image systems. In particular, he likes to use Midjourney to create. And he started doing that after he had a stroke and it left him unable to use his left side and he was left-handed. And so he has found ways to use these systems to help him expand his own work, which I think is something that is a useful tool within our community. And then another way that it could help with access. You've got folks who maybe English is not their first language who use AI systems to help them do things like write grant applications, help pick images from their portfolio to use in pitches. It can be a useful tool when it's used thoughtfully.

As we were talking about a little bit earlier, we've had artists who used AI to create art to pitch for our streetwear brand. We've also seen community members who may love that art, but are really wary of buying or using AI because as you mentioned, it goes against their values. They don't want to consume anything that's made by machines. They don't want to contribute to disabling technology. And so even though we have folks who do use it as an accommodation, we have to work really hard to straddle this line about what is an accommodation and what is just straight up machine created art, and how does that fit into the overall mission of what we are going to do?

Yeah. I don't know. I think there are really large red flags that should be going off for all of us when we see systems that are not actually created by the people who are closest to harm. I want to name them because I think that everybody's not necessarily thinking about them. What happens to the data that you type into ChatGPT? Does it just sit there if you put it in there, or is the system then using that data to further train its model? Is there a potential that another artist could jack your art style and become the next industry darling just off the strength of your hard work? Or if you're an artist who's using these generative AI systems, are you accidentally ripping off somebody else's art style with your new pieces? Or are you doing something that we saw a lot of small creators do when I was at Mozilla, which is take the initiative to create a local AI system on your laptop that can be trained only on your past work, so you can use it to iterate on your own approach to your work. And taking an approach like that both protects your data from being used outside of your own space and decreases the environmental impact of gen AI, because you're not running a server farm to do this, it's just right here locally on your machine.

And when you do things like that, you decrease the environmental impact. And then there's also the ways that the ... Oh, go ahead.

Meier Galblum Haigh:

I was going to say, can you talk a little bit more, because I know you said a bit ago that there was also this piece of there's a big opposition to the disabling impacts of AI in our community, and that's a big part of the AI policy that you're thinking about. Can you talk a little bit more about the disabling impacts of AI?

Kenrya Rankin:

Yeah. Absolutely. So there was an article, I want to say it was in 2023, where Time Magazine did this whole exposé about how OpenAI was paying data labelers in Kenya less than $2 per hour to find and label disturbing content that it had pulled into its dataset. And that could include anything from hate speech to violence to sexual abuse. And the work itself was so traumatic for the folks who were drafted to do it that they had to shut down the entire operation after a few weeks because workers said that their mental health was deteriorating. That is a really clear example of how working in AI spaces is disabling folks. But even closer to home, we're seeing companies that create these generative AI systems follow the same playbook that the gas and oil companies have always used, which really comes down to polluting Black neighborhoods all in the name of advancing technology. It's essentially what we call sacrifice zone policies, where the greater good, and I'm making massive air quotes because the greater good in this case is just capitalism, is more important than the wellbeing of the people.

Meier Galblum Haigh:

Profit.

Kenrya Rankin:

Yeah. Period. Profits over people is what it comes down to. And so we've seen folks like Elon Musk, he built a data center. He has this AI company called xAI, and he built a massive data center in South Memphis, Tennessee, which is a predominantly Black neighborhood. It's already suffered from factories being built there that pollute the air. But we know that his data center is using enough energy to power a hundred thousand homes every year. And in order to meet those energy needs, he's running methane gas turbines all over South Memphis. And those turbines spew chemicals into the air, just like you would imagine that they do. And just in these last couple of years, the city smog has already increased by up to 60% in some areas, and it's negatively impacting the health of the Black folks who live there. This really goes back to the fact that all forms of oppression, and in this case, environmental racism, are disabling.

So as I think about our policies, it really feels like the important thing is to strike this correct balance between increasing access and mitigating harm. Because ultimately, if disabled creators are going to be using generative AI to make art and to improve access, that cannot be done at the cost of our health and our safety.

Meier Galblum Haigh:

Yeah. And I would just add, if folks are looking to read more about that, Capital B News has done such a great job of leading the way on reporting on the incredible environmental racism around expanding data centers across the country, which are being built almost entirely in Black and brown neighborhoods nationwide, and especially in Black neighborhoods, with effects on electricity costs, huge effects on water costs, and even wiping out clean water supplies entirely in a lot of communities. And then also, like Kenrya talked about already, clean air. But I think a lot of times the public health angle gets reported with this idea that, oh, you're a little bit sicker. But it's not that you're sicker for a short period of time. These are lifelong disabling costs that individual households and communities bear. And the disability tax is so ... It's a lifelong tax. It's not something that goes away. It changes how you work. It changes where you can go to school. It changes your education. It changes so much in terms of your access to everything.

And then you see all these stories come out about shaming people for using wheelchairs in airports, shaming people for needing accommodations at school, the ways that being disabled impacts your entire life. And these companies are allowed to externalize these costs while bringing in record profits. When do they pay, and when do we? Yet again, these individual communities are being put on the line to subsidize bonuses for folks like Elon Musk. And what does it mean, when the point of art is to express the human condition, to use art to subsidize, at the end of the day, the bonus of somebody like Elon Musk? Especially for us, when our art is meant to be expressing liberation.

Kenrya Rankin:

Yeah. It's real.

Meier Galblum Haigh:

Ooh, that is really tough.

Kenrya Rankin:

It is.

Meier Galblum Haigh:

That is a tough policy to try to tread.

Kenrya Rankin:

And that's where we find ourselves.

Meier Galblum Haigh:

And that is where we find ourselves. So anyhow, we mentioned that we're in the midst of this visioning process at Disability Culture Lab. We've brought together all these amazing thinkers from across AI ethics and labor, the disability community, and the arts community on what winning actually looks like at this intersection: the future of work and AI and arts, all these different places. Is there anything that surprised you or that changed your mind?

Kenrya Rankin:

We have been having a lot of fun over these last few months as we partner with these more than a dozen groups, as you mentioned, to dream together. I can't say that anything changed my mind because I try to approach this work holistically and think about the nuance and try to sit in the uncomfortability of all of that. But there were definitely some moments that surprised me. Big picture, we spoke with folks who were much more open to using AI than I expected, including artists. But I will say that they were all very clear that the right conditions need to be in place in order to make AI a tool that does more good than harm for our community. So even the folks who ... I can't say anybody was banging the AI drum, but folks are like, okay, if this, then that.

And so we had some really interesting things come up about what winning looks like. Some folks suggested that we make "user-owner" a legal status in this country, where people who opt to use models actually own the data that is put into them to train them, and also any exported content, and have voting control over how data from those systems is used as a whole, which I thought was really interesting. We also talked about reparations for folks who've been harmed by this technology, and the OpenAI class action suit is really an example of what something like that might look like. We delved into the practicality of developing and using AI intentionally to disrupt injustice and enable fair conditions, rather than just trying to thwart the bias that we know is baked into it. Because human beings are biased and AI is created by humans, all the bias that we know folks walk around with all day is also built into these systems, and that is one of the things that can make them really harmful.

We also explored the idea of building a disability community AI marketplace that's just full of local-level solutions, which, as I spoke about earlier, is a really great way to get around the environmental impacts if you're just using something locally on your own machine. That disability community AI marketplace would have apps and tools that are community created, community owned, and community controlled, and really designed for our groups to benefit from AI in concrete ways. So yeah, honestly, it's just been a really enlightening process. I'm really excited to share our findings with everyone when we finish synthesizing everything. It's pretty cool. I'm excited to be part of it.

Meier Galblum Haigh:

I think that probably can be extrapolated across other communities in terms of closed community or closed individual large language models. What does it mean to be able to allow folks more funding or more space for creativity where the ultimate beneficiary isn't an Elon Musk or a billionaire, but is the community that is creating the product? So that's a key takeaway.

Kenrya Rankin:

Yeah. It feels like a really interesting space for funders to specifically meet the needs of folks on the ground. So I'm excited for what that might lead to.

Meier Galblum Haigh:

Yes. So as we're thinking specifically about supporting disabled artists through partnerships, fellowships, hopefully micro-granting in the future, or, for folks listening, maybe even large granting in the future, let's hope we're moving a lot of money to disabled artists.

Kenrya Rankin:

That part!

Meier Galblum Haigh:

Do you think that there's anything that folks should be keeping in mind when it comes to AI policy? Where and when do we start or stop talking about AI in these processes?

Kenrya Rankin:

Yeah. That's a really, really great question. I think that something that has become very clear during this North Star visioning work that we've been doing at the intersection of disability and AI and art and labor is that techno-solutionism is not going to save us. For those listening, that's essentially the idea that all of our problems can be solved if we just land on the right technology. But TV shows like Black Mirror have shown us that's not always true, and real life has shown us that's not always true. More importantly, life has shown us that many of the problems folks are actually seeking to solve via AI are really failures of the state.

Why are folks feeling like they need to use AI to do applications, or to apply for something like Social Security Disability Insurance? Because the administrative burden is just too hard to bear on top of the crip tax that we already pay, which makes it difficult for us to move through this world because it's not built for us. Why is the government undercounting disabled people during the census, which then directly impacts the resources that are available for us? Why are there inaccessible voting policies out here that prevent us from being able to weigh in on the laws that impact our lives, just like everybody else?

Before we really delve into how tech can supposedly help us solve all our problems, I think we really need to overhaul the systems that make us reach out for this untested digital solution to begin with. I think we also need government leaders who know that eugenics as an organizing principle is not a great one. We need unions that are actually built to protect disabled workers specifically. We need a government that actually treats disabled people like humans who are worthy of care, regardless of how much we can or cannot produce under the system of capitalism. And we need employers who value the workforce contributions of disabled people just as much as they value the contributions of other workers. And then after that, I think that maybe we are in a place where we can use tools like AI in ways that are actually supportive of our communities across all sectors. The reality of it is the most effective systems ... And I feel like a broken record, because I think this is the third time I've said a variant of this. But the most effective systems are those that are built by the people who are most impacted by them. So any AI policies and tools that we build here that are built across the disability community or that are built specifically for disabled artists and other workers should be created in that same way.

They need to be created via a co-design process that takes into account the actual needs and the impacts on the ground, not just in some lofty world that we hope that we see, but in the world that we make. And then those systems need to actually be built by disabled people who have both the expertise and the lived experience to integrate into those systems. And that, I think, is how we end up with AI that honors our creativity, that honors our labor, that honors our bodyminds and the limitations that come with all of those things. That's how we get to where we want to go.

Meier Galblum Haigh:

I love that. And I feel like we need art that helps explain all of that.

Kenrya Rankin:

Yes.

Meier Galblum Haigh:

So definitely to make it happen.

Kenrya Rankin:

Yes.

Meier Galblum Haigh:

I feel like art is how we see, feel, smell, and believe that liberation actually exists. Because without it, I just don't think any of it is real. Art helps us believe all of these things are possible; it gives us hope, for a moment in time, while we're experiencing it. So when we say we need government leaders to do X, or employers who value disabled people as much as other workers, or disabled people who can't work as much as people who can, I don't think that's possible without art, to be quite frank. I think we need art as a part of how we tell different stories. So what's next? What happens after this?

Kenrya Rankin:

We going to tell those stories. Next up, we're going to publish the findings from all of this North Star visioning work that we've been doing at this intersection, and then we're going to use that to keep refining our policies and to keep organizing across the disability community, but also just with other folks in other movements, because all of our liberation is tied together. And so, putting on my nerd hat, whatever data we can uncover to be able to move us forward in that direction, that's what we're excited to share with the community.

Meier Galblum Haigh:

I love that. And I would just say, if you're listening, talk with friends about your AI policies as well. If you think that you've got an old AI policy, talk with friends, because we all need better AI policies and better AI organizing, especially at this intersection with the arts. I think we could all be doing this better, because it's new to almost every organization and almost every sector right now. So I would say that's a big part of what's next too. But as of today, I would say thank you so much for joining us, and a huge thank you to Grant Makers in the Arts for having us at Disability Culture Lab and allowing us to join in your podcast today, and to the Grant Makers in the Arts community for letting us join in. We are deeply grateful to be in community with y'all. This is Disability Culture Lab, joining y'all and now signing out.

Jamie Sharp:

Thank you so much for listening to this GIA podcast episode, and thank you to Meier and Kenrya from Disability Culture Lab for joining us today. You can check out more of our podcasts on reader.giarts.org. Catch you next time.


ABOUT THE SPEAKERS

Meier Galblum Haigh is the founding Executive Director of the Disability Culture Lab (DCL), a nonprofit disability media and narrative lab with a mission to shift the narrative on disability from fear and pity to solidarity and liberation. DCL engages in media, arts, and public education to change the culture and narrative around disability and win material improvements in the lives of disabled people. Meier is a trans-masc, nonbinary disabled person with nearly 20 years of experience at the intersection of movement building, media, and communications. Born and raised in Fairbanks, Alaska, they have proud rural roots but now live in Washington D.C. with their spouse, elementary schooler, and pup.

Kenrya Rankin (she/her) is Managing Director for the Disability Culture Lab and an award-winning author, communication and narrative change strategist, journalist, editorial consultant, and disability justice advocate. A Black, disabled, queer woman, Kenrya creates dynamic, high-impact content that amplifies the lived experiences, advocacy, and labor of people of color to shift the narrative around who deserves liberation, joy, justice, and dignity. A 25-plus-year editorial veteran, her work has appeared in dozens of national publications, including The New York Times, Reader’s Digest, Ebony, Fast Company, and Redbook, and has been translated into 21 languages. And she has authored five books, including How We Fight White Supremacy: A Field Guide to Black Resistance. Kenrya most recently served as research editor at Mozilla Foundation, where she made technical topics accessible for all and built infrastructure at the intersection of AI and social justice. She previously served as senior editorial director at Colorlines (published by nonprofit RaceForward); on staff at Reader’s Digest, Latina, and Uptown Magazine; as senior advisor at Megaphone Strategies; and as contributing editor at ShopSmart, among other roles. Kenrya holds a bachelor of arts degree in journalism from Howard University and a master of science degree in publishing from New York University. A proud native of Cleveland, Ohio, she lives in Washington, D.C., where she enjoys Beyoncé dance parties and belly laughs with her kid.

Grantmakers in the Arts GIA

Grantmakers in the Arts is the only national association of both public and private arts and culture funders in the US, including independent and family foundations, public agencies, community foundations, corporate philanthropies, nonprofit regrantors, and national service organizations – funders of all shapes and sizes across the US and into Canada.

https://www.giarts.org