CTRL + ART + APPLY: Navigating AI Ethics, Access, and Integrity in Grantmaking
Anna Tragesser, Indy Arts Council, Indianapolis, Indiana
In October 2025, arts grantmakers packed a conference room at the Royal Sonesta Minneapolis to discuss AI ethics, access, and integrity in grantmaking.
This panel discussion, hosted by Grantmakers in the Arts’ Support for Individual Artists Committee, firmly emphasized the artist and applicant perspective on the use of AI in grant processes. Panelists unpacked opportunities and pitfalls for artist applicants who use AI in artistic creation, grant application, and grant review; they aimed to generate awareness of the artist perspective for grantmakers who are navigating this new ground.
TLDR: In a reality where funders and grantmakers will increasingly encounter new opportunities to incorporate AI into their workflows, they’ll have to apply their own ethics to the specific situation at hand to prioritize humanity, access, and stewardship.
AI makes both artmaking and grant writing more accessible for many people. Expect that applicants will use AI to prepare applications and/or in their creative and administrative processes.
Clarify your own priorities related to AI as a grantmaker, and be transparent about them. It’s okay to set guidelines and parameters for applicants’ use of AI, but it’s especially useful to explain the reasons behind the rules. If grantmakers care about applicants using AI, they must educate themselves on it.
Steward a submitted grant application with care, as it contains precious intellectual property and potentially sensitive information.
Aim to preserve the deeply human experiences and tasks. This means keeping humans involved in grant decisions.
Although this panel didn’t specifically focus on the ecological impact of AI, it can’t be ignored. It’s a critical perspective when applying personal ethics to each scenario.
Meet the panelists:
Koven Smith (he/him) - Independent Consultant, previously with the Knight Foundation
Teresa Hardaway (she/her) - Owner of design studio Blackbird Revolt and Black Garnet Books, Associate Professor of Graphic Design, and Director of Design Justice at the University of Minnesota
Tim Brunelle (he/him) - Writer, creative director, Minneapolis College of Art & Design, University of St. Thomas AI Institute for the Common Good, consultant
Dee Harris (she/her) - Director of Open Culture Storytelling, Creative Commons
Zoe Cinel (they/them) - Interdisciplinary artist, curator, instructor at Metro State University
The panel explored three scenarios in which an applicant might encounter AI in the grantmaking process, and how an artist might respond:
When artists use AI: You’re an artist who employs AI techniques and methodologies (Generative AI, Machine Learning, etc.) in your practice, and you are working on a grant proposal. The grant application does not specify whether the use of AI tools or AI-generated work is acceptable or not; it is silent on the matter.
When AI is prohibited: You are an arts practitioner applying to a grant program, and the grantmaker has explicitly banned “AI use” in the creation of applications, without additional clarification.
When AI is used to evaluate: You have submitted an application to an open call program. You later find out that the grantmaker used ChatGPT to review at least some of the applications that were submitted. This was not disclosed in the grantmaking process or application materials.
Juicy scenarios, right? Let’s dive in!
When Artists Use AI in an Application Process
Scenario: You’re an artist who employs AI techniques and methodologies (Generative AI, Machine Learning, etc.) in your practice, and you are working on a grant proposal. The grant application does not specify whether the use of AI tools or AI-generated work is acceptable or not; it is silent on the matter.
As an artist applying for an opportunity that did not specify if AI could be used or not, Zoe said, “I would use it, and I would expect to not be juried on that.”
Zoe provides three examples where an artist might use AI in an application process:
In a hypothetical digital illustration competition, an artist might create an illustration solely through text-to-image generation, applying no additional edits. Zoe says this feels like a shortcut: these platforms have a specific aesthetic that’s easy to spot. Plus, there are questions of authorship (the US Copyright Office currently holds that works lacking human authorship are not copyrightable). If the artist were asked to write about the artistic process, they wouldn’t have much to say. Zoe suggests the artist consider: Does this demonstrate professionalism to the jury? Can it even be fairly juried in an illustration contest?
An artist in the field of new media may make work that uses computer vision, computer generation, or another AI tool. The artist uses it to critically engage with the process of generating images, and to expose its flaws or celebrate its qualities and possibilities. They spend a long time making the work and thinking about it. The artist may mention the use of AI in the grant writing process. In this case, an artist may wonder: Are the jurors informed enough to review my application? Will the jurors bring biases because they are unfamiliar with these AI processes?
A photographer makes surreal photographs. Instead of having to edit out small details of images, they can use the tool embedded in the software to make edits that would manually take much more time. The result is the same. The artist may question: Do I even need to disclose the fact that I used AI in this way?
As Teresa Hardaway pointed out, AI is a tool just like a paintbrush is a tool. In fact, many seasoned graphic artists will remember that the introduction of Adobe tools stirred a similar fervor. However, it matters whose hand is on the paintbrush: your skill in using the tool shapes the outcome.
Because AI can be assistive, collaborative, or generative, Dee Harris suggests: “Ask the artist: Where is your hand in this?”
AI could appear anywhere in the application process, and artists will need clarity about grantmakers’ stance on using AI. Grantmakers should proactively share their own values.
“Funders need to be clear. In order to be clear, you have to be fluent,” says Tim Brunelle. “Are you funding the raw idea? The craft? A vague idea like innovation? AI falls into all those ideas in different ways. If you care about the use of AI, be clear about what you mean. Funder fluency is the most important thing.”
But funders don’t have to have definitive answers. “Investing in educating staff, asking and testing questions, shows that grantmakers and institutions are building trust with artists. Trying to understand their process builds trust.”
Image from the AI session at GIACON25.
If grantmakers aren’t transparent about their stance on the use of AI in an application process, artists may hesitate to be fully transparent about their own process.
“What happens when a foundation makes a judgment based on the use of AI?” Koven asked the panel. “Do you have concerns that disclosing or not disclosing the use of AI may harm your application?”
Teresa responded, “I’m thinking about all the other ways that bias shows up in the application process. Racism, sexism, and ableism always show up. People will have this bias whether it is disclosed or not. There are more things that would affect someone’s bias on my application than the use of AI.”
In the same vein, perhaps grantmakers should consider that anti-AI bias may color their own judgment. “When you’re not sure how to react to the use of AI, talk to the person. Learn about how the use of this technology supports (or doesn’t) what they’re trying to do,” says Zoe.
When AI is Prohibited in an Application Process
Scenario: You are an arts practitioner applying to a grant program, and the funder has explicitly banned “AI use” in the creation of applications, without additional clarification.
To summarize the panel’s resounding reaction: Not cool, bro.
Both grantmakers and artists on the panel acknowledge that artists should spend their limited time creating, and believe that AI can help.
Teresa prompts artists to push back on a blanket AI ban and ask, “What do you mean by AI?” Do assistive writing tools like autocorrect count, or do grantmakers mainly have a problem with generative tools?
“There are questions of ableism, access, and equity,” Tim points out. “If a person doesn’t have access to a computer, broadband, or an Adobe license, but they can create something in a way they’ve never created before, who are we to say no? We need to consider the rules we’re going to enforce, and whether we can really verify that the tools are being used in the ways we think they should be.”
Panelists suggested that if a grantmaker did provide clarity on their AI policy, it might look something like this:
Instructions in plain language.
Examples of what is or is not permissible, even if they can’t be exhaustive.
The intended outcome of the grant opportunity so that an artist can understand if and how AI might help accomplish it.
When AI is Used to Evaluate Applications
Scenario: You have submitted an application to an open call program. You later find out that the funder used ChatGPT to review at least some of the applications that were submitted. This was not disclosed in the funding process or application materials.
Some grantmakers are pretty uncomfortable with this scenario, and it seems unlikely that grantmakers are currently using AI technology in this way. But as technology evolves and staff capacity is stretched, grantmakers may be forced to find ways to lean on AI assistance.
Panelists bounced around several perspectives for grantmakers to determine what influence AI should have in their grantmaking processes.
Most importantly, keep humans involved.
AI’s comprehension has limits. It carries its own biases. It may not grasp nuance, or how concepts fit within or challenge cultural and societal contexts. Dee urges us to hold on to the experiences and tasks that are deeply human. “Use your imagination to understand the meanings behind the words, seeing where the ideas came from.”
“Artists use very specific words to describe their work,” Zoe explains. “Sometimes a word has been reclaimed. That process takes time and iterations of people sharing it. For example: the word ‘crip.’ What does AI think about that word? I don't know, but people have written books reclaiming that word. The first person who reclaimed that word probably would have been clocked by AI as biased. Now people see themselves in the word ‘crip.’”
Teresa wonders: “If you need to use ChatGPT to go through these applications, can you set aside some of the funds to help you out?”
Some panelists discussed possibilities for using AI in the review process, with caveats. Primarily, be transparent.
“Disclose it,” Zoe implores. “This goes back to the idea of trust. It depends on what kind of AI was used and how it was used, but it’s possible that someone’s material will be harmed, shared, or used in a way that it wasn’t supposed to. Artists would be less likely to participate in your program in the future.”
“Be as transparent as possible,” said Tim. He provided some specific examples of what this transparency could look like:
Just like a land acknowledgement, include an “AI acknowledgement” on your website. This helps artists understand how fluent the organization is in AI technology and how it views the use of AI at different points in the process, and it encourages artists to be more transparent with the organization in turn.
If you know you’re going to use an AI tool in the review process, disclose it in advance of the application process. Include the prompt that will be used. Not only does this assure artists that they will be judged by consistent criteria, but it also ensures that they can evaluate if their intellectual property and sensitive information will be stewarded with care by both the grantmaker and the technology.
Which, of course, brings us to the million-dollar question: what about copyright? At the time of this discussion, US courts are still sorting out the limits of copyright for AI-generated content, as well as litigating alleged copyright violations by a number of AI companies.
This is Dee Harris’s particular area of expertise at Creative Commons, so take her advice to heart: grantmakers shouldn’t expect US policy to keep up with ethical issues in the ever-morphing, international AI landscape. When it comes to AI, each of us must apply our own ethics to the specific situation at hand. Use your brain and heart to steward the grant process.
ABOUT THE AUTHOR
Anna Tragesser helps organizations and artists access grant funding and resources for their creative work and wellbeing. Connect with Anna if you are curious about grant opportunities or are seeking connections to help your ideas soar. Anna’s Indy inspirations include the Kheprw Institute’s Cafe Creative community and COMPANION. She finds peace in rural life, quilting, building kites, and sharing stories.