With the growing use of AI, campus officials are trying to set clear guidelines for college application essays.
Artificial intelligence might be the new frontier in technology, but Toby Reed, a senior at Fremont High in Oakland, has no doubts about whether to harness its powers — at least on his college application essay.
“No. It’s blatantly plagiarizing,” said Reed, who, like hundreds of thousands of other California seniors, is in the process of applying to colleges. “It’s bad enough stealing content, but with ChatGPT you’re not even stealing from a real person.”
In the first application season since generative AI tools like ChatGPT have become widely available, colleges and high schools are grappling with the ethical and practical implications of text-writing technology.
“We can’t pretend it away,” said Josh Godinez, a high school counselor at Centennial High in Riverside County and former president of the California Association of School Counselors. Students are using AI on their college application essays, whether grown-ups like it or not, he said.
Most school leaders and college experts that CalMatters interviewed agree that students who rely exclusively on AI to write their college application essays are violating academic integrity rules and are subject to having their applications rejected. But there’s plenty of nuance in the details, and guidelines can be vague and confusing.
The California Department of Education encourages districts to explore the potential benefits of AI, particularly in computer science curricula or as part of broader lessons in media literacy. But it leaves decisions about AI use in classrooms to school districts, many of which already have plagiarism policies that could cover the use of AI to write essays.
That means most students applying to college now are at least familiar with the ethics of using technology to write their essays for them.
“We want our students to understand how AI works and how to leverage it, but also understand the ethical implications,” said Katherine Goyette, the state education department’s computer science coordinator. “AI is here. We need to teach students and educators how to learn with it, and learn about it.”
And even if colleges prohibit essays whose provenance is generative AI, nabbing a student for robotic plagiarism is an imprecise science. The company behind ChatGPT shut down its own AI-detection tool in July, citing its low accuracy: it frequently flagged human-written text as AI-generated. One scholar, quoted in a Wired article, noted that even a 1% false-positive rate is inexcusable, because for every 1,000 essays, that's 10 students who could be accused of academic theft they didn't commit.
JR Gonzalez, chief technology officer for the Los Angeles County Office of Education, noted that no AI detection tool is 100% accurate. And AI itself can occasionally produce wrong information.
Varying policies on AI in admissions essays
Common App, the college application tool used by 1,000 institutions nationwide, added a restriction on "substantive" AI use in college admissions applications to its fraud policy in August. The addition was a response to feedback from member colleges and an internal desire to "keep up with the changing technologies," a spokesperson wrote.
What does “substantive” mean? Common App’s CEO, Jenny Rickard, said there’s no definition, and that’s intentional, writing in an email that “we will evaluate the totality of the circumstances to determine if a student truly intended to misrepresent content generated by AI technology as their own work.”
Common App doesn’t determine whether students are being honest — that’s up to the member colleges to figure out. But if Common App concludes that a student plagiarized, that student’s account may be terminated and Common App will notify the campuses to which the student applied, Rickard wrote.
University of Southern California, which uses the Common App exclusively to process its admissions and is a top choice for Common App applicants, is less lenient.
“Were we to learn that an applicant had used generative AI for any part of their application, their application would be immediately rejected,” the university said in a written statement. The school turned down CalMatters’ request to interview an admissions official.
But the stern words lack teeth. The highly selective private university isn’t employing any AI-detection software, a spokesperson wrote.
The University of California and its nine undergraduate campuses permit students to use generative AI in admissions essays in limited form, such as “advice on content and editing,” but “content and final written text must be their own,” its written policy states. Unlike the state’s private campuses, UC operates its own admissions portal.
But the UC Office of the President, which turned down a CalMatters request for an interview on the topic, wouldn’t specify how it detects whether students relied on AI tools to write their essays. “UC conducts regular screenings to verify the integrity of the responses, may request authentication of the content or writing as the student’s, and will take action when it is determined that the integrity of the response is compromised,” including plagiarism through AI, its guidance states.
A UC spokesperson, Ryan King, suggested students are wasting their effort by relying on AI generative tools, writing “it would be more work for them to try building a strong ChatGPT prompt than it would be to develop their own original responses to the (essay questions).”
Campuses that responded to CalMatters indicated that while generative AI can be a source to spitball ideas, structure an outline and generally shape the essay-writing process, the tools are no match for human voices to communicate nuance and how an applicant’s life experiences tie into the various essay questions. There’s also limited room for the banal writing AI tools typically generate — the Common App essay response can’t exceed 650 words while UC’s four essays are capped at 350 words each.
Pomona College, a highly selective institution that accepts applications through the Common App, has no formal policy on AI use in admissions essays, though its director of admissions thinks it’s “not very good at nuance, personalization or helping a student communicate in their authentic voice, which is what we’re really looking for when we evaluate an application,” wrote Adam Sapp in an email.
“The value of a 350-word response on topics like leadership, resiliency, or creativity may be diminished if it doesn’t directly reflect a student’s own experiences,” noted UC Riverside’s director of undergraduate admissions, Veronica Zendejas, in an email.
Responses from Stanford University and UC Berkeley relayed similar sentiments.
Zendejas offers practical tips for the anxiety-inducing task of crafting essay responses, telling prospective students to “write in clear, straightforward prose, much like they would in an interview or a conversation. This approach should alleviate concerns about the need for AI tools to assist in writing their responses.”
University of San Francisco, another Common App partner, won’t use AI detection software for college applications because the campus doesn’t think it’s necessary. The full picture of a student’s fit on campus comes into view from their grades, letters of recommendation and other aspects of the holistic application review, said the university’s associate provost who oversees undergraduate admissions, Sherie Gilmore-Cleveland, in an interview.
Gilmore-Cleveland said after a student’s high school academics, the essay is the second-most important factor in a student’s application at University of San Francisco. But in her 20-plus years of working in admissions, she’s never encountered a student with weak grades and a strong essay who was admitted. Other admissions officers have also questioned how much of a boost an essay gives an applicant.
However, a student with good grades and an awful essay may be rejected from the university if they’re applying to a competitive major. The student may be re-routed to another major or rejected outright; decisions are made case by case, Gilmore-Cleveland said.
Using generative AI is easy
But not everyone applying to colleges that require essays is a good writer, said Jeffrey Hancock, a Stanford University professor of communication. “They’ll probably find that they do better when they use a tool like this,” Hancock said of applicants using generative AI.
Hancock said students with no coding experience can coach a tool like ChatGPT, especially the latest premium version, into generating strong essays, through a process he likened to fine-tuning.
First, an applicant pastes essays of students who were admitted to top colleges into an AI tool. The student tells the tool to analyze the essays for positive traits. Then, the applicant pastes essays from students who were rejected from schools and prompts the AI to look for patterns to avoid. Along the way, the student confirms the AI tool understands the task. “‘Do you understand the difference between the two?’ And it would say, ‘Yes, I’ve found this pattern versus that pattern,’” Hancock said.
Finally, the student prompts the tool to generate a rough draft based on those findings.
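The workflow Hancock describes amounts to example-driven prompting of a chat model. The sketch below shows how such a session might be assembled as a list of chat messages; the function name, prompt wording and message format are illustrative assumptions, not details from the article, and the resulting list would still need to be sent to an actual AI service.

```python
# Illustrative sketch of the prompting workflow described above:
# show the model admitted and rejected essays, confirm it sees the
# difference, then ask for a draft. The {"role": ..., "content": ...}
# shape follows the common chat-API convention; all names here are
# hypothetical.

def build_essay_session(admitted, rejected, topic):
    """Assemble a chat transcript that teaches the model by example."""
    messages = [
        {"role": "system",
         "content": "You analyze college application essays for patterns."},
    ]
    # Step 1: essays from admitted students, with a request to find strengths.
    for essay in admitted:
        messages.append({
            "role": "user",
            "content": "This essay was ADMITTED. Note its strengths:\n" + essay,
        })
    # Step 2: essays from rejected students, with a request to find pitfalls.
    for essay in rejected:
        messages.append({
            "role": "user",
            "content": "This essay was REJECTED. Note patterns to avoid:\n" + essay,
        })
    # Step 3: confirm the model distinguishes the two groups.
    messages.append({
        "role": "user",
        "content": "Do you understand the difference between the two groups?",
    })
    # Step 4: request a rough draft based on those findings.
    messages.append({
        "role": "user",
        "content": f"Using those patterns, draft a 650-word essay on: {topic}",
    })
    return messages

msgs = build_essay_session(["essay A"], ["essay B"], "overcoming adversity")
print(len(msgs))  # 1 system + 1 admitted + 1 rejected + 1 check + 1 draft = 5
```

Nothing about this requires coding skill on the student's part; the same steps can be carried out by pasting the essays and instructions into a chat window one message at a time, which is the scenario Hancock describes.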
Hancock co-published a peer-reviewed study in March showing that humans can detect AI-written work about as accurately as predicting a coin toss — meaning poorly. And “as you build detectors, the AI gets better,” Hancock said, adding that he anticipates an arms race between detection and evasion.
And while generative AI may be the latest cause célèbre, it’s part of a long line of help students have been able to access for decades. Teachers, counselors and family members offer students writing support. So can pricey tutors, who — even if they’re ethically opposed to writing an essay for a student — can still provide tailored coaching in a way that’s inaccessible to most low-income students.
The debate over AI use in college applications reflects a larger trend in classrooms. Educators are deciding how to adapt to artificial intelligence, especially as it improves and becomes more ubiquitous. Some districts have yet to address the issue, while others have adopted comprehensive guidelines promoting its benefits and warning of its dangers.
The Los Angeles County Office of Education held an AI symposium last summer for hundreds of educators, and is crafting guidelines for the 80 districts it oversees. Despite AI’s risks, the clearest benefits, according to Gonzalez, are for teachers and administrators: creating lesson plans, making master schedules, tracking student achievement and attendance, writing grant applications and even crafting state-mandated accountability plans.
Christine Elgersma, senior editor for learning content strategy at Common Sense Media, a research and advocacy nonprofit, suggests that schools move forward thoughtfully as they create AI policies and include students in the discussion. Students should understand the ethical implications, the biases that exist in AI algorithms, the potential for misinformation and the privacy risks.
“Since college essays are so personal, it brings up a question of privacy,” Elgersma said. “For example, pieces of your story could turn up folded into another student’s AI-generated essay.”
Students should also understand the value of learning to write, and think, independently, “developing your own ideas and expressing yourself in words, with clarity and profundity and a flair that’s your own,” she said.
Tara Sorkhabi, a senior at Monte Vista High School in Danville, said her teachers have been clear in discouraging, if not outright banning, the use of AI for writing assignments. While Sorkhabi has found AI useful in studying chemistry, for example, she does not believe students should use it for college application essays.
“Admissions officers wouldn’t know who they’re accepting. They’d basically be admitting a bot,” she said.
She also thinks that allowing AI in college application essays is unfair to students who toil for weeks perfecting their own essays without the help of machines.
Reed, the Fremont High senior, said students who over-rely on AI for writing assignments are ultimately cheating themselves, because they’re not learning valuable skills like research, expression and critical thinking.
“It’s your future,” Reed said, noting that students should take advantage of opportunities to expand their minds, not use short-cuts. “You can’t plagiarize in school. You can’t do it at work. People like AI because it’s quick and easy, but it’s not good.”