

The Caste Bias Problem in OpenAI’s Models
AI isn’t neutral—it learns our prejudices and repeats them as if they’re facts. When ChatGPT “corrects” an Indian surname to a dominant-caste one, that’s not help—it’s erasure. Caste bias in AI hides in defaults, names, and silence. If these systems scale with unchallenged assumptions, they won’t just mirror inequality—they’ll automate it with confidence and speed.

Vishwanath Akuthota
6 days ago · 5 min read