
Should you use ChatGPT for your College Essay?

Brad Schiller

I know what you’re thinking. ChatGPT is good at writing. Therefore, why not use it for your college essays? 

But this article is here to convince you not to do it – especially if you’re applying to selective colleges.

First things first, let’s admit that — yes — ChatGPT can write pretty good college admissions essays. In fact, it writes better than many (or even most) applicants:

  • The language is engaging and flows well. 
  • It uses a solid structure that’s easy to follow. 
  • It contains buzzwords related to what colleges are looking for.

There are two problems with using ChatGPT or another AI to write your essays.

  1. It writes in the AI’s voice – not yours. Not only will the essay not sound like you, but it’ll sound like all other AI-written essays, which are fairly easily detectable by humans and AI checkers.
  2. Your goal isn’t to write essays that are better than many (or even most) applicants. Your goal is to write essays that are better than nearly all other applicants.

In this article:

  • An example essay shows why ChatGPT doesn’t work for college admissions
  • ChatGPT writes in its own voice – not yours
  • ChatGPT doesn’t understand what admissions officers are looking for in essays
  • What ChatGPT writes (instead of what admissions officers want)
  • How the example essay would fail with admissions officers
  • ChatGPT fills in gaps with fluff – not compelling, truthful content
  • How you can use ChatGPT to help you – with the right prompts

Essays are crucially important — much more so than most students realize. That’s because selective colleges use essays to differentiate between tens of thousands of academically similar applicants. In fact, our analysis of data from the Harvard admissions litigation shows that strong essays improve admissions chances by 10x at Ivy and equivalent colleges.

Below, we’ll use an example ChatGPT-written essay to show why AI currently fails at this task. At the very end, we’ll show you a few techniques that you can use to have ChatGPT strengthen essays that you yourself write. 

But rather than rely on this article alone, we suggest working with a human writing coach. Prompt’s Writing Coaches have helped over 30,000 students achieve their college admissions goals. Relying on ChatGPT as your coach may be better than nothing (or than relying on Aunt Gertrude) – but it’s not going to help you differentiate your essays from those of the applicants you’re competing against.

An example essay shows why ChatGPT doesn’t work for college admissions 

Let’s start by looking at an essay ChatGPT wrote, responding to a prompt we provided:

Note on prompting ChatGPT — generally, the more detail you provide, the better ChatGPT will do.

Here’s what the paid version of ChatGPT (the GPT-4 model) gave us in return:

Wow. When you first look it over, it seems like a pretty good-sounding essay. The language is easy to follow, and the flow is engaging.

But the essay fails on closer inspection. The content is poor: “These experiences have taught me a great deal about myself and others. In Mexico, I learned the immense power of patience and persistence.” It sounds good, but it doesn’t give the depth college admissions readers are looking for. They don’t want to hear that you learned something; they want you to demonstrate it. Plus, the voice is clearly AI (“the azure-blue day of my departure,” “my belief in the transformative power of education,” “igniting a flame that has grown into a full-blown passion”).

Let’s dig in to show exactly why and how this kind of AI-written essay will let you down.

(Note: if you want a few other examples of ChatGPT-written college essays, this article has several, all with the same issues we discuss here.)

 1. ChatGPT writes in its own voice – not yours

Your voice is a huge part of your college essays. You’re telling your story through your most compelling experiences – the ones that prove you’ll be successful in college and beyond. An admissions officer uses your essays to picture you as a member of the campus community and, later, as an alum. They can’t do that if they don’t get a sense of your personality and how you think.

ChatGPT’s voice is not yours. As we said, it’s a blend of all the examples of “good” college essays it has seen across the internet. You may think ChatGPT’s writing sounds good (you’re not wrong). But it’s not you, and admissions officers will know that.

Admissions officers read lots of applications – often 50+ per day. While ChatGPT is new this admissions season, they will quickly learn to spot which essays are written by AI. ChatGPT’s voice is obvious. Some schools will even use AI checkers, such as CopyLeaks, which, while not perfect, do a decent job of detecting ChatGPT.

You might think you can outsmart admissions officers — take something written by AI and modify it. Except it doesn’t work. It’s hard not to be influenced by a draft that “reads well” and looks grammatically correct and authoritative. Worse, your essay could end up choppy, your voice interspersed with ChatGPT’s.

Keep in mind – admissions officers spend an average of 8 minutes per application. They don’t have the time to read and think super carefully about whether something is AI-written. It’s easier to dismiss an application and move on to the next if they suspect AI may have written it.

If we apply this to the essay ChatGPT wrote, note that most of the phrases we highlighted above as weak also read as having a particular “ChatGPT voice.” Let’s take that delightful phrase: “igniting a flame that has grown into a full-blown passion.”

This isn’t how normal people write. It especially isn’t how high school students write. But it is how ChatGPT writes — it’s the exact same voice it uses in all of its essays. That makes it easy for admissions officers (and their AI detection tools) to identify.

2. ChatGPT doesn’t understand what admissions officers are looking for in essays

ChatGPT is a long way from AGI – artificial general intelligence that can actually think, in the way we do as humans. 

According to computer scientist Cal Newport, ChatGPT was essentially “trained on passages extracted from an immense corpus of sample text that includes much of the public Web.” It generates answers to queries using a “word-voting strategy,” basically, predicting the most common word to follow any particular phrase.
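
To make this concrete, here is a toy sketch (in Python) of what “predicting the most common next word” means. It is nothing like ChatGPT’s actual model – the phrases and frequency counts below are made up for illustration – but it shows why this kind of generation gravitates toward the most common phrasing found in its training text.

```python
# Toy illustration of "predict the most common next word."
# This is NOT how ChatGPT actually works internally; the phrases and counts
# below are invented purely to illustrate the idea.
from collections import Counter

# How often each word follows a given phrase in some imaginary sample text.
next_word_counts = {
    "transformative power of": Counter({"education": 42, "technology": 17, "love": 3}),
    "igniting a": Counter({"flame": 28, "fire": 11, "spark": 9}),
}

def predict_next(phrase: str) -> str:
    """Return the word that most often follows `phrase` in the sample data."""
    counts = next_word_counts.get(phrase)
    if counts is None:
        return "<unknown>"
    word, _ = counts.most_common(1)[0]
    return word

print(predict_next("transformative power of"))  # -> education
print(predict_next("igniting a"))               # -> flame
```

Because the most common continuation always wins, the output drifts toward the most common (and most clichéd) phrasing – which is exactly the “ChatGPT voice” problem described above.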

In other words, ChatGPT can only write essays based on what’s available online. Do you think most essays online tend to be excellent? Or would you guess that most aren’t that good? The correct answer is: most are terrible. Yet that’s what ChatGPT will reproduce. 

Moreover, ChatGPT doesn’t understand what colleges are looking for in essays. It produces text that aligns with what it finds online — that is to say: with the myths about what colleges want. “Tell your story.” “Let us get to know you.” 

In reality, selective colleges want you to show you’ll be successful in college and beyond. Specifically, they’re looking for experiences that exemplify one or more of the 5 Traits Colleges Look for in Applicants: Drive, Intellectual Curiosity, Initiative, Contribution, and Diversity of Experiences/Interests. They want to see you’ve done something that other applicants could not have done (or couldn’t have done as well).

What ChatGPT writes (instead of what admissions officers want)

Because of its pattern-matching, when ChatGPT writes an essay (or provides advice on a topic, or gives feedback on a draft), it focuses on the wrong things:

  • Its writing and advice align with typical applicants – not those applying to selective colleges. Since it pulls advice and examples from all students, ChatGPT doesn’t follow the pattern of students who get admitted to selective colleges — which involves both making content highly compelling and fitting a lot of that great content into the essay’s word count (i.e., writing very concisely).
  • It focuses on descriptive language over content. ChatGPT uses a narrative approach, reflecting college essay advice found online. It will thus prioritize beautiful prose and descriptive language. But for selective colleges, most lovely phrases are missed opportunities to talk about an applicant’s potential for college success (i.e., the 5 Traits).
  • It uses buzzwords. ChatGPT pulls common words and phrases it finds across admissions essays and in advice articles into its essays. The phrases sound good – “I have a passion for empowering others.” But these buzzwords aren’t proof. For example, describing a time you actually empowered another person is far more compelling than simply stating you have a “passion for empowering others.”

How the example essay would fail with admissions officers

Now, let’s apply what we’ve learned here to our ChatGPT example essay. In broad strokes:

[1] These are things most students would do on a service trip. The essay has nothing about how the student went the extra mile. It doesn’t go into detail on impressive outcomes. It doesn’t show that this student is unique or exceptional.

This content just doesn’t cut it compared to what applicants to selective schools submit. In terms of the 5 Traits, while there may be some drive/initiative here, the examples are weak.

At best, the student decides to teach some English as well as math to the Mexican student Pablo and then goes on to tutor another student after returning home. These are not … super impressive examples of going above and beyond.

The only way to improve the sense of this applicant’s drive and initiative would be to get more detail on the challenges involved in tutoring these two students and what the applicant did to overcome them. For example, did they seek out books for speed-teaching a child English? Did they consult a great English-as-a-Second-Language teacher and apply those lessons? Did they meet resistance from the program and overcome it somehow? Did they do this while simultaneously learning Spanish and overcoming jet lag?

We don’t know what the particulars were. And so none of it is in any way impressive.

[2] The essay has way too much descriptive language. Let’s look at the very first phrase: “the azure-blue day of my departure.” This may be nice, but it’s taking up space that is doing nothing for the admissions reader. 

Azure-blue days have nothing to do with this student. The admissions officer is looking for a reason to move this application from the huge reject pile to the tiny accept one. The fact that this student once experienced good weather is not that reason. 

Moreover, there was nothing about lovely departure-day weather in the prompt we fed ChatGPT. It made this fact up! Does it matter? Actually, yes. Your essay should be factual and authentic. This essay isn’t that. 

[3] The essay’s plethora of buzzwords sounds nice but adds no value. Basically, the last four paragraphs of the essay are nothing but buzzwords. We singled out some examples earlier: “my belief in the transformative power of education,” “igniting a flame that has grown into a full-blown passion.”

Are these phrases going to get the admissions officer excited? Do they have a chance to move the essay from the reject pile to the admit pile? No! 

Where is the proof that the student believes in the transformative power of education? There’s nothing to show that this applicant has done more than tutor two students and had an okay time doing it — barely meeting any obstacles along the way. 

In addition, the essay doesn’t show us what actions the student has taken now that they believe in the transformative power of education. What effect is this “full-blown passion” for education having on the applicant’s life? There’s nothing here to convince us.

3. ChatGPT fills in gaps with fluff – not compelling, truthful content

ChatGPT uses whatever content you give it to write essays — if you don’t give it enough, it fills the gaps for you. It has a few ways to do this (all bad):

  • Waxing philosophical about the world or what you learned about yourself (this is where buzzwords tend to come in – see section above), or
  • Adding made-up stories and anecdotes (e.g., the azure-blue sky on the departure date).

Again, this is why the nice-sounding essays ChatGPT produces fail upon closer inspection. In our example essay, once the AI gets past the tutoring story, it has multiple paragraphs of fluff where we learn nothing more about the student.

In addition, ChatGPT doesn’t yet handle word counts reliably. In other words, you can’t get it to write, say, a great 650-word essay or keep a response under 200 words. Combined with its fluff-generating bias, this is a recipe for a lot of weak content.

We’ve illustrated this concept below. In the ChatGPT-generated essay, all the parts an admissions officer would consider fluff are highlighted in blue.

How you can use ChatGPT to help you – with the right prompts

ChatGPT isn’t a total loss for college essays. There are ways it can be helpful. What you need are the right prompts. 

We’ve spent many, many hours experimenting with ChatGPT and developing prompts that yield useful results. We’ll share more about the prompts in a future article. Here are three ways we’ve found to make ChatGPT more helpful.

  1. Get ChatGPT to help you think about what to write about. AI can be a not-bad brainstorming guide. We’ve figured out how to get ChatGPT to provide a line of questioning that will help you build out more compelling content. Just be careful – ChatGPT doesn’t have a great understanding of what colleges are looking for in essays. So you need to include that context in the prompt you provide (e.g., asking it to use the 5 Traits Colleges Look for in Applicants to guide you, copying and pasting the 5 Traits from this article).
  2. Ask ChatGPT for feedback on the content and structure of your draft. You can give ChatGPT a draft you wrote and ask it for feedback on how to make it better. It’s important to add explicit questions you want it to answer (e.g., “What didn’t you learn that you wanted to learn?”). But once again, be careful. Adding relevant content to your prompt (e.g., the 5 Traits) will help ChatGPT provide better feedback. You can even ask ChatGPT for an example outline for restructuring your essay in ways that’ll make it more compelling.
  3. Get ChatGPT to help you reduce your word count. As writing coaches, we find many students struggle when writing to a word count. ChatGPT can help – again, with the right prompt. If you ask it to make your draft more concise, it’ll heavily rewrite what you have (even changing your voice) – not a good option. Giving ChatGPT a word-count target also doesn’t work since, as we said, ChatGPT doesn’t have a good sense of word count. We’ve found that using the phrase “minimally edit” in your prompt does a decent job of identifying words and phrases you could cut without much rewriting. This is especially powerful when you also indicate specific things you want ChatGPT to look for as it edits, such as removing unnecessary details or prepositional phrases. You can then take ChatGPT’s output and feed it into a document comparison tool (Google Docs, Word) to see which edits ChatGPT made (ChatGPT isn’t good at comparing documents). Then you can decide which edits to use, skip, or modify. (A rough sketch of what this kind of prompt can look like follows below.)
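
If you’re comfortable with a little code, here is a minimal sketch of the “minimally edit” prompt from tip 3, written against the openai Python package instead of the ChatGPT window. The model name, the exact instruction wording, and the placeholder draft are our assumptions for illustration – the same prompt text works pasted directly into ChatGPT.

```python
# Sketch: asking ChatGPT to "minimally edit" a draft to cut word count.
# Assumes the openai Python package (v1+) and an OPENAI_API_KEY set in your
# environment; the model name and prompt wording are illustrative only.
from openai import OpenAI

client = OpenAI()

draft = """<paste your essay draft here>"""

prompt = (
    "Minimally edit the essay below to reduce its word count. "
    "Do not restructure it or change my voice. "
    "Only cut unnecessary details and prepositional phrases.\n\n"
    + draft
)

response = client.chat.completions.create(
    model="gpt-4",  # example model name
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
# Next step: paste this output and your original draft into a document
# comparison tool (Google Docs, Word) to see exactly which edits were made.
```

The same pattern applies to the brainstorming and feedback prompts above: paste the 5 Traits and your specific questions into the prompt text so ChatGPT has the context it otherwise lacks.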

For more tips on how to use ChatGPT for your college essay, sign up for a free Prompt account. If you want more individualized support, we have 1-on-1 coaching packages with experts who will help you write essays that stand out to admissions officers.

Brad Schiller
Brad Schiller graduated from MIT with a Bachelor of Science in Mechanical Engineering and Management Science with a concentration in Operations Research. He has worked in business consulting with McKinsey, founded two businesses, and written a book. He started Prompt with two fellow MIT alums, Jordan and John, to make people better writers. Their premise was simple: give everyone access to on-demand feedback on their writing from subject-knowledgeable Writing Coaches. Years later, Prompt is the largest provider of feedback on admissions essays in the world. Come and join us on our journey by emailing [email protected].