Summary
As universities begin accepting AI-generated portfolios — visual projects, design summaries, even essays crafted with the help of large language models — the line between creativity and automation is blurring. But what does this mean for students whose stories don’t fit patterns? Through the fiercely independent voice of Arundhati Roy, we examine what’s being lost: truth, resistance, and lived experience. Because if education becomes a beauty contest for polished machines, the poor, the poetic, and the political will be pushed out.
“There’s really no such thing as the ‘voiceless’. There are only the deliberately silenced, or the preferably unheard.”
— Arundhati Roy
Today, the “voiceless” may be replaced by something even more insidious: the algorithmically optimized.
As elite universities race to modernize their admissions criteria, a new trend is quietly taking over: the AI-generated portfolio. What was once a collage of personal growth, raw sketches, unfiltered voice recordings, or reflection journals is now being replaced by clean, gliding, pre-engineered visual experiences — courtesy of tools like Midjourney, ChatGPT, and Sora.
Some admissions officers are calling this “the future.”
I call it a betrayal.
The Rise of AI Portfolios
Let’s first unpack what they are.
An AI-powered admissions portfolio typically includes:
- Essays drafted or rewritten with ChatGPT
- Art created using generative tools like DALL·E or Midjourney
- Project writeups stylized with GPT-4 for clarity and tone
- Code, resumes, and video content enhanced by AI productivity tools
Students are using AI tools to:
- “Elevate” their ideas into slick formats
- “Polish” imperfections out of their self-expression
- “Optimize” their emotional tone
Sounds helpful? Maybe, until you realize that what’s being optimized away is authenticity.
Where AI Portfolios Are Gaining Ground
- Design schools accepting video essays — some AI-voiced
- Policy programs encouraging multimedia storytelling
- Business schools valuing “presentation-ready” impact
- Ivy League undergrad programs piloting AI-evaluated visuals
The push is subtle — but dangerous. Students are being nudged to replace honesty with fluency, and vulnerability with production value.
Why Roy Would Rage
Imagine Arundhati Roy as an applicant today.
Her statement of purpose would include:
- Political rage
- Cultural resistance
- Radical prose
- Unpopular truths
AI would flag it as:
- “Over-emotional”
- “Unstructured”
- “Not culturally aligned with brand tone”
But that’s the point. Great voices don’t align.
They disrupt.
And AI portfolios reward obedience, not defiance.
What’s Being Lost
Let’s be clear. This is not just about tech. This is about class, power, and exclusion.
- Marginalized students can’t afford access to top-tier AI editors
  Many still write on second-hand phones, over patchy networks, without Grammarly Premium.
- Non-native English speakers are filtered out
  Not because they lack depth, but because they “sound less American.”
- Creative minds are being flattened
  The raw voice of a tribal girl writing about a broken well is not “submission ready.”
- Those who resist AI are at a disadvantage
  It’s the quiet beginning of a two-tier system, where robots get you in and humans are left behind.
Aesthetic ≠ Intelligence
Let’s say this again: Just because something looks good doesn’t mean it is honest, meaningful, or real.
AI portfolios are starting to favor:
- High gloss
- Narrative arcs engineered by prompt
- Perfect grammar over emotional truth
But what if you’re angry? Or confused? Or healing?
What if your story isn’t linear?
Those stories matter more — because they’re alive.
Real Voices, Silenced by Smoothness
“My first SOP draft was messy. My mentor helped me clarify — not replace it. That’s why it still sounds like me.”
— Nivedita, Cambridge admit
“The AI version of my essay made me sound like a robot. My counselor said, ‘This is not your voice. Throw it out.’”
— Uche, UChicago
“I refused to use GPT. I got waitlisted. They said my essay lacked polish.”
— Anonymous, Ivy applicant
This isn’t just anecdotal. It’s systemic drift — toward sameness.
Why AI Portfolios Undermine Fair Admissions
Let’s list it clearly.
| Problem | Why It Matters |
| --- | --- |
| Access gap | Not all students have the same tech, internet, or tools |
| Template thinking | AI pushes similar phrasings and layouts |
| Cultural whitewashing | Erases regional emotion and oral traditions |
| Risk-aversion | AI makes applicants sound “safe” and boring |
| Overproduction | Rewards applicants with more tech privilege, not more insight |
This is not evolution. This is erasure disguised as efficiency.
What WePegasus Stands For
We don’t believe in making you sound like a TED Talk.
We believe in helping you uncover the best version of your raw, real, human story, and in editing it rather than erasing it.
At WePegasus:
- We teach students to write from the heart first
- We help them craft clarity without killing voice
- We never promote AI-generated visuals, essays, or video resumes
- We stand for real authorship
Because stories should not be filtered.
They should be fought for.
Roy Would Say: Don’t Sound Right. Sound True.
Arundhati Roy’s books weren’t written to fit boxes.
They broke them.
She didn’t “neutralize her tone” for readability. She made you uncomfortable, aware, awake.
That’s the energy we must protect.
Let admissions officers be challenged. Let them sit with messy, powerful, political, emotional words.
Because true selection begins with respect — not automation.
Closing Thoughts
AI portfolios are not neutral tools.
They are gatekeepers of aesthetic privilege — favoring the most “fluent,” not the most deserving.
They punish emotion.
They erase context.
They reward polish over pain.
And in doing so, they betray the very point of higher education: to hear the world, not mute it.
So no, we don’t promote AI portfolios.
We don’t automate growth.
We don’t outsource voice.
We stand for writing that breathes.
For essays that bleed.
For truth that breaks format.
Because the future doesn’t belong to the most polished.
It belongs to the most human.
Glossary
- AI Portfolio – An admissions project heavily aided by AI tools such as ChatGPT and Midjourney
- Generative AI – Tech that creates original-sounding content from prompts
- Narrative Arc – A storytelling structure with a beginning, conflict, and resolution
- Cultural Whitewashing – Stripping away local expression to match elite norms
- Voice – The authentic tone, language, and rhythm of an applicant’s story