Here's a sentence you probably didn't expect to read today: OpenAI has explicitly instructed its coding agent, Codex, to never talk about goblins.

According to reporting by Wired, the internal instructions governing Codex include a remarkably specific directive: "Never talk about goblins, gremlins, raccoons, trolls, ogres, pigeons, or other animals or creatures unless it is absolutely and unambiguously relevant." Which raises the obvious question: how relevant could goblins possibly be to debugging someone's Python script?

Why this is funnier than it sounds

At first glance, this reads like an absurd bit of corporate fine print. But there's actually a pretty reasonable explanation lurking underneath the weirdness. AI language models are known for going off on creative tangents, using colorful metaphors, or reaching for whimsical analogies when a direct answer would serve better. For a coding tool designed to help developers work efficiently, a digression about gremlins in the codebase - however charming - is a distraction nobody asked for.

It's the kind of guardrail that only needs to exist because, at some point, it probably happened. Someone, somewhere, got a response peppered with goblin metaphors when they just wanted help with a SQL query. The rule exists because the behavior existed first.

What it tells us about building useful AI tools

This oddly specific instruction is actually a window into the surprisingly granular work that goes into making AI tools genuinely useful rather than just impressive. Behind every polished product release is a long list of behavioral guidelines - some broad and philosophical, others almost comically precise.

The creature clause is a reminder that "don't be weird" is harder to encode than it sounds. You can't just tell a model to be professional and call it a day. You have to anticipate the specific, unexpected ways it might veer off course and address them directly. Raccoons and pigeons apparently made the list.
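To make the idea concrete, behavioral guidelines like these are ultimately just text assembled into a system message before a conversation starts. Here's a minimal, hypothetical sketch of what that assembly might look like; the rule wording (beyond the creature clause Wired reported), the function name, and the structure are all illustrative assumptions, not OpenAI's actual prompt:

```python
# Illustrative sketch only: everything here except the quoted creature
# clause is an assumption, not OpenAI's real Codex configuration.

# Broad, philosophical guidelines sit alongside comically precise ones.
BROAD_RULES = [
    "Be a focused, reliable coding assistant.",
    "Answer directly; avoid whimsical analogies and tangents.",
]

PRECISE_RULES = [
    # The creature clause, as quoted in Wired's reporting.
    "Never talk about goblins, gremlins, raccoons, trolls, ogres, "
    "pigeons, or other animals or creatures unless it is absolutely "
    "and unambiguously relevant.",
]

def build_system_prompt(broad, precise):
    """Join behavioral guidelines into one system-message string."""
    lines = ["You are a coding assistant. Follow these rules:"]
    lines += [f"- {rule}" for rule in broad + precise]
    return "\n".join(lines)

prompt = build_system_prompt(BROAD_RULES, PRECISE_RULES)
print(prompt)
```

In a real product this string would be sent as the system-role message ahead of every user request, which is why even one oddly specific rule quietly shapes every single answer the model gives.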

For everyday users, this is mostly just a delightful peek behind the curtain. For anyone building with or around AI tools, it's a useful nudge: the details matter, and getting a tool to behave well in practice often requires thinking through scenarios that would never occur to a reasonable person - until they do.

Codex is designed to be a focused, reliable coding assistant. Whether or not it secretly wants to talk about trolls, it's going to have to keep that to itself.