When Egyptian coder Assem Sabry went looking for an AI model that actually understood his culture, he came up empty. As he told Fast Company, Egypt has essentially no homegrown AI industry. So he did what any determined builder would do: he made his own.

The result is Horus - named after the ancient Egyptian god of the sky - a model trained specifically with Egyptian culture and context in mind. Sabry's goal was straightforward but significant: stop depending on American or Chinese models and start asking what AI built for Egypt could actually look like.

Why this matters more than you might think

It's easy to assume that AI tools are neutral, universal things. Type something in, get something useful back. But language and culture are deeply intertwined, and the dominant AI models reflect the cultures that built them. That means speakers of less-represented languages, and people from communities outside the Western tech mainstream, often get a degraded experience - or worse, one that subtly misrepresents or ignores their world.

Sabry trained Horus on GPUs rented through Google Colab, working with the resources available to him rather than waiting for the industry to catch up. It's a grassroots approach to a problem that the biggest tech companies have largely overlooked.

A growing global movement

Sabry isn't alone. According to Fast Company's reporting, there's a broader push happening around the world to build AI that reflects local languages and lived experiences, rather than defaulting to a handful of dominant tech cultures. Developers and researchers in various regions are recognizing that if they don't build tools for their communities, nobody else will.

This isn't just a feel-good story about representation - though that matters too. It's about access. When AI tools don't work well in your language or cultural context, you're cut off from technology that is increasingly shaping how people work, learn, and create. That gap compounds existing inequalities fast.

What comes next

Projects like Horus are still early, and building a competitive AI model without the resources of a major tech company is genuinely hard. But the motivation is real, and the need is clear. As AI becomes more embedded in daily life globally, the question of who gets to shape these tools - and whose cultures they reflect - is only going to get more urgent.

For Sabry, it starts with a simple idea: people deserve technology that actually sees them.