Stop asking me how to prompt ChatGPT: Reflections from a PhD in AI
If your biggest concern is prompting better, you’ve already missed the point. And if you’re only blaming individual users, think bigger.
Whenever someone finds out I have a PhD in AI sociology, the reaction is almost comically predictable: “Oh! How do I prompt better?”
Then comes the cascade of hype—
“Have you tried custom GPTs?”
“Did you see the shift from multimodal to omnimodal?”
Or at the very least: “Can you make it write like Taylor Swift meets McKinsey?”
It’s not just sad. It’s insulting.
Because this is what expertise in AI has been reduced to: how well you can phrase a request to make a machine perform. As if real knowledge is about tweaking inputs, not questioning impact. As if the revolution is about tips and tricks, not systems, power, ethics, and harm. I feel ashamed—not of my work, but of what people expect it to mean.
Yes, AI is advancing—but mostly, it’s advancing capitalism.
Over the past five years, we’ve seen wave after wave of so-called breakthroughs: from GenAI to agentic AI, each rollout wrapped in sleek UX and sold as a leap forward. But peel back the press releases and you’ll find the same old engine: productivity, speed, convenience. Capitalism with a neural net.
Corporate marketing and tech media have sold us a dream: a future where you can do more, faster, and with less effort. Now you can code, design, write novels, contracts, essays—entire lives—in minutes.
And the most seductive part is that it’s all framed as “democratisation.” Skills and talents, suddenly accessible to all. It sounds noble. Like justice dressed in code.
And I’m not saying it’s entirely false. I believe in the potential. But we need to ask: Who’s crafting this narrative? Who’s repeating it the loudest? Is it everyday users justifying their addiction to tools they can’t unlearn? Or product teams convincing themselves they’re liberating the world—while cashing in on its exhaustion?
Are these systems really designed to close gaps? To uplift the marginalised? To redistribute power? To alleviate poverty? To save the planet?
Please. Spare me the pitch deck.
My expertise isn’t Prompt Engineering—it’s society.
When people introduce me as an AI expert, it’s not about how to coax better outputs from a chatbot, but how these systems reshape society. Not better prompts, but deeper patterns. Not time-saving hacks, but human costs. Cultural costs. Social costs.
You might assume I’m the first to jump on the hype train. I try the tools, yes. But I’m also the first to step back and ask: What is this doing to us? How does it change how we see ourselves, each other, and the world around us?
Let me tell you what research has shown so far.
In hiring, AI systems penalise women more than men. Résumé screeners built on historical data reward masculine-coded language and penalise women for equivalent experience. Men who take time off for caregiving are punished even more harshly, reinforcing the notion that caregiving is incompatible with ambition. Women exit the workforce not by choice, but because the systems—automated or not—push them out.
In dating, AI doesn’t just reflect biases—it amplifies them. Black women and Asian men receive the fewest matches across platforms. Older women are penalised more than older men. Non-binary users are forced into binary categories. Fem-presenting profiles are hypersexualised, while masc-presenting ones are rewarded for “stability.” The algorithm doesn’t just sort—it judges.
At work, AI doesn’t improve conditions—it intensifies them. Burnout is scaled under the guise of optimisation. Labour is no longer valued for thoughtfulness, only for throughput. And in creativity, the theft is subtle but pervasive. Artists and writers are scraped, mimicked, and repackaged without consent. When OpenAI models mimic Miyazaki’s style, they don’t honour it—they flatten it. The philosophy, the brushstroke, the cultural context—all erased. What we call “creative augmentation” is often just cultural erasure.
And there’s more behind the scenes: invisible labour. The Kenyan data workers labelling toxic content for less than $2 an hour. The energy-hungry servers sucking up water in drought-prone regions. The carbon footprint of training a single model exceeds that of five cars over their entire lifetimes. These are the real costs behind the frictionless interfaces.
This isn’t just about the users—it’s about the system that demands it. When productivity becomes survival, there’s no room to question.
Let me be clear: I’m not against AI. I believe AI can be used for good. But not like this. Not in a world where the loudest voices are shielded from its harms. Where the biggest companies profit from “open” models while quietly consolidating power. Where ethics becomes a checkbox, not a compass. Where entire industries are redesigned for output, not outcomes.
And no, don’t reach for the usual metaphor: “Just like any other tool, a knife can be used to cook good food or to kill.” Because the harm isn’t just in how we use it. It’s also in why we’re forced to.
Yes, some users don’t stop to think. But let’s be clear—it’s not always out of carelessness. For many, it’s because they can’t afford to. They’ve been pulled into the churn—pressured to produce more, optimise more, perform more.
They’re not the empowered. They’re the exploited.
These aren’t the early adopters tech companies parade around in keynote slides. They’re the invisible backbone of the system—overworked, overwhelmed, and clinging to shortcuts just to survive. Because when you’re racing the clock just to make rent, “efficiency” isn’t a choice. It’s a lifeline.
And then there are the creators. Caught in a loop they never consented to. Using GenAI to predict trends. To make content. To outdo the content GenAI made possible. It’s not creativity—it’s crisis in disguise. A game with no finish line. Who, exactly, are we racing against?
This isn’t just about personal choices. That’s why I never blame individuals for overreliance on AI. Agency doesn’t exist in a vacuum. It’s shaped by structures. And those structures are systemic, corporate, and designed.
Unfortunately, AI wasn’t built for those with time to think. It was built to serve those who don’t. And the people most dependent on it are not the elite—they’re the overworked, the under-resourced, the chronically stretched. Being able to opt out of the AI grind is a privilege. More time means you can think. More money means you can delegate. Freedom from the system is, itself, a luxury.
And that’s why this isn’t an individual problem—it’s a collective one.
Next time you meet me, don’t ask me about prompts, ask me this: “How might we take back power within the agency we still own?”
I’m not here to make you better at using AI. I’m here to ask why we keep using it without asking hard questions.
If you take one thing away from this, let it be this: In the age of AI, don’t just ask what you can do with it. Ask what you’re willing to lose for it. And who you’re becoming when you stop questioning it.
As someone knee-deep in the field, here are five things I urge you to reflect on:
Use ≠ Wisdom
Just because something is new—or popular—doesn’t mean it’s necessary. Ask: What problem is this solving? Who decided that’s the problem?
Convenience has a cost
Every shortcut is built on someone else’s time, energy, dignity, or land. If it feels too easy, ask who’s paying for that ease.
Access ≠ Equity
Just because more people can use the tools doesn’t mean power is being shared. Ownership still matters. So does intention. So do the rules that get baked into the system.
Output ≠ Impact
Short-term results often hide long-term consequences. Just because something “works” now doesn’t mean it builds a better future. True impact considers not only what’s produced, but what it produces in return.
Before you praise, ask who profits
If AI makes your life easier—at whose expense? And if it doesn’t serve you—then what, or who, is it really serving?
Literacy isn’t about keeping up with updates. It’s about holding space for discomfort. It’s about refusing to mistake speed for wisdom. And above all, it’s about choosing agency over automation.
Cover art: Venus with a Mirror, Titian, c. 1555. Retrieved from the National Gallery of Art Open Access.