Let's try an experiment.
I want you to tell me your favorite meal you've ever had. Take a moment to think about it.
What happened in your mind just now? You likely scanned through your most memorable meals—your personal "training dataset," if you will—weighing them against each other until you settled on one that stood out. Perhaps it was that perfect pasta in Italy, a home-cooked holiday dinner, or street food that surprised you with its complexity. And just like AI models, everyone's training datasets differ. If you're like me and rarely eat out, your search might be quicker but more limited in scope. Whatever you chose, what you just performed was essentially a search operation—retrieving and evaluating memories until you found the optimal result.
Now for the second part: Come up with a creative name for a restaurant.
Again, what happened in your mind? You might have combined words in interesting ways, played with food-related puns, or drawn inspiration from locations, cuisines, or concepts you admire. This feels more like creation—after all, you've produced something new that didn't exist before. But look closer at the process. Weren't you still searching through your mental database of restaurant names, food terms, and linguistic patterns? Weren't you retrieving and recombining elements you've encountered before?
One more scenario: Imagine you're dining at an alien restaurant on a spaceship. What would it look like and what would be on the menu?
This pushes us further into what feels like pure creation. Yet even as you envision floating tables, exotic ingredients, or gravity-defying serving methods, you're still drawing from a lifetime of accumulated concepts—science fiction you've consumed, physics you understand, and Earth restaurants you're familiar with. Your alien restaurant, however imaginative, is built from transformed and recombined elements you've searched for in your mental repository.
This raises a question: Is there a difference between searching and creating? Or is what we call "creation" actually a form of search followed by analysis? And what does this have to do with AI?
I was recently discussing AI with someone who was skeptical that it could produce anything original. AI is just "autocomplete," they said. A pattern repeater. All it can do is regurgitate its training data in new combinations. Creativity remains outside the scope of these mathematical algorithms.
It made me wonder if what we mean by creativity is actually any different.
Everything is a Remix
"Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn't really do it, they just saw something. It seemed obvious to them after a while."
— Steve Jobs
Jobs understood something about creativity that many of us struggle to accept: true originality might be an illusion. What we call "creation" is often the art of making novel connections between existing ideas.
This concept is explored in Kirby Ferguson's documentary series Everything Is A Remix. Ferguson demonstrates how creativity across all domains—music, film, literature, technology—follows a consistent pattern: copy, transform, and combine. The Beatles borrowed heavily from American R&B. Apple's revolutionary graphical interface built upon innovations from Xerox PARC. Even Star Wars, that monument to imagination, is a deliberate pastiche of Flash Gordon serials, Kurosawa films, and Joseph Campbell's hero's journey.
Consider how creative professionals actually work. They don't conjure ideas from nothing—they use systematic approaches to transform existing concepts. The SCAMPER technique, taught in design and innovation courses worldwide, codifies this process:
Substitute (Replace one element with another)
Combine (Merge existing ideas)
Adapt (Borrow an idea from another context)
Modify/Magnify/Minify (Change scale or emphasis)
Put to Other Uses (Find new applications)
Eliminate (Remove components)
Reverse/Rearrange (Change order or orientation)
This isn't a formula for "cheating" at creativity—it's how creativity actually works. When a chef creates a "new" dish by substituting one ingredient for another or a musician combines two different genres, they're not creating ex nihilo. They're searching through possibility space and identifying promising combinations.
Look at any creative work closely enough, and you'll find its influences. Pulp Fiction revolutionized cinema while deliberately sampling from dozens of earlier films. House of Cards adapted a British series which itself drew from Shakespearean tragedy. Harry Potter combines elements from boarding school stories, hero's journeys, and earlier magical worlds.
If human creativity is fundamentally recombinatorial—if we're all just searching through the space of possible combinations—then perhaps the distinction between "search" and "create" is more semantic than substantive. And if that's true, then the skepticism about AI's creative potential deserves reconsideration.
But this leads us to an even more mind-bending possibility: if all possible combinations already exist in theory, then perhaps everything that could ever be written, composed, or designed already exists in some abstract sense. We're not creating new works—we're discovering particular volumes in an infinite library of possibilities...
The Library of Babel: Where Everything Already Exists
This idea—that we're discovering rather than inventing—was hauntingly captured by Jorge Luis Borges in his 1941 short story "The Library of Babel." Borges, an Argentine writer known for his labyrinthine fictions exploring infinity and reality, imagined a universe in the form of a vast library containing all possible 410-page books with every possible combination of characters.
In this library exists every book that has ever been written and every book that will ever be written. It contains this article, and all possible variations of it—including versions with one typo, two typos, or completely different conclusions. It contains your biography, including accurate and inaccurate versions. It holds books of pure gibberish and books of profound wisdom not yet discovered.
The catch? The overwhelming majority of the library consists of complete nonsense. Finding a single coherent page would be nearly impossible amid the astronomical number of meaningless volumes.
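Just how astronomical? Here's a minimal back-of-the-envelope sketch in Python, using the figures from Borges' story (410 pages per book, 40 lines per page, roughly 80 characters per line, and a 25-symbol alphabet):

```python
import math

# Figures from Borges' story: each book has 410 pages, 40 lines per page,
# roughly 80 characters per line, drawn from an alphabet of 25 symbols.
SYMBOLS = 25
CHARS_PER_BOOK = 80 * 40 * 410  # 1,312,000 characters per book

# The library holds 25 ** 1,312,000 distinct books. The number itself is too
# large to be worth printing; its digit count alone makes the point.
digits = math.floor(CHARS_PER_BOOK * math.log10(SYMBOLS)) + 1

print(f"Characters per book: {CHARS_PER_BOOK:,}")
print(f"Number of books: 25^{CHARS_PER_BOOK:,} (a number with {digits:,} digits)")
```

For comparison, the number of atoms in the observable universe is a number with only about 80 digits.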
What Borges created as a philosophical thought experiment has been given digital form by Jonathan Basile, who created libraryofbabel.info. This website algorithmically generates every possible combination of characters within certain constraints, allowing you to "search" for any text. Type in a paragraph, and the site will tell you the precise "coordinates" where that text has always existed in the library.
This isn't just a literary curiosity—it's a metaphor for creativity itself. When Shakespeare wrote Hamlet, perhaps he wasn't creating something new but finding a particular volume in this theoretical library. His genius wasn't in making something from nothing, but in his ability to navigate the vast space of possible combinations and recognize which ones had meaning and value.
The same applies to any creative act. A composer isn't inventing new notes but discovering effective arrangements of existing ones. A chef isn't creating new flavors but finding pleasing combinations of ingredients. They're all searching—with varying degrees of skill and intuition—through a space of possibilities that theoretically already exists.
It turns out this is essentially how language models generate text: by navigating a probability space to find likely sequences of tokens. A language model is searching through a statistical approximation of the Library of Babel, using the patterns it observed in its training data as a guide.
The Mathematics of Meaning: How Words Become Numbers
While Borges' Library of Babel gives us a philosophical framework for understanding creativity as search, modern AI systems implement this concept in a surprisingly elegant way: by turning words into numbers.
The first time I encountered this idea—that the meaning of words could be encoded mathematically—I was filled with a sense of wonder that hasn't diminished with time. There's something profoundly beautiful about the notion that the richness of human language, with all its nuance and poetry, can be represented through numerical relationships. It's as if mathematics itself is the hidden language beneath all our words.
When you read the word "cat," your mind conjures images, associations, and memories—perhaps your childhood pet or a cartoon character. But computers can't directly process meaning this way. Instead, language models represent words as long lists of numbers called vectors.
As Sean Trott and Timothy Lee explain in their excellent article "Large language models, explained", here's one way to represent the word "cat" as a vector:
[0.0074, 0.0030, -0.0105, 0.0742, 0.0765, -0.0011, 0.0265, ...]
At first glance, this seems like a bizarre and reductive way to represent language. How could these seemingly random numbers possibly capture the essence of "cat-ness"? But this mathematical representation enables something remarkable: it allows computers to reason about meaning through geometry.
Imagine a vast multidimensional space where each word occupies a specific point. In this space, similar words cluster together. The vector for "cat" sits near "kitten," "dog," and "pet." The vector for "Paris" is close to "France," just as "Berlin" is close to "Germany." This isn't just organizational convenience—it's a map of meaning itself.
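If you want to see this geometry in miniature, here's a small sketch with made-up three-dimensional vectors. Real models use hundreds or thousands of dimensions, and these particular numbers are invented purely for illustration; the only thing that carries over is the idea that nearby vectors mean related words.

```python
import numpy as np

# Toy, hand-picked 3-dimensional "word vectors" (the numbers are invented
# for illustration; real models learn hundreds or thousands of dimensions).
vectors = {
    "cat":    np.array([0.90, 0.80, 0.10]),
    "kitten": np.array([0.85, 0.75, 0.15]),
    "dog":    np.array([0.80, 0.90, 0.20]),
    "paris":  np.array([0.10, 0.20, 0.95]),
}

def cosine_similarity(a, b):
    """Measure how closely two vectors point in the same direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "cat" sits near "kitten" and "dog" in this toy space, far from "paris".
for word in ("kitten", "dog", "paris"):
    print(f"cat vs {word}: {cosine_similarity(vectors['cat'], vectors[word]):.2f}")
```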
What's truly astonishing is that these vectors capture not just similarities but relationships. In 2013, Google researchers discovered that vector arithmetic could perform a kind of mathematical reasoning. For example:
"King" - "Man" + "Woman" = "Queen"
The vector arithmetic had somehow captured gender as a consistent direction in the space. Other relationships emerged as well:
"Switzerland" is to "Swiss" as "Cambodia" is to "Cambodian"
"Paris" is to "France" as "Rome" is to "Italy"
"Big" is to "Biggest" as "Small" is to "Smallest"
The vectors aren't random; they're derived from analyzing millions of texts to see which words appear in similar contexts. Words used in similar contexts end up with similar vector representations.
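You can try the famous arithmetic yourself. Here's a minimal sketch using the gensim library and one of its downloadable pretrained GloVe vector sets (the specific model name below is just one of the small sets gensim distributes; any pretrained word vectors would do):

```python
import gensim.downloader as api

# Download a small set of pretrained word vectors (GloVe, 50 dimensions).
# The first call fetches the data; afterwards it is cached locally.
vectors = api.load("glove-wiki-gigaword-50")

# "king" - "man" + "woman": ask for the words whose vectors lie
# closest to that point in the space.
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3)
for word, score in result:
    print(f"{word:>10s}  {score:.3f}")
# "queen" typically appears at or near the top of this list.
```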
When a language model generates text, it's navigating this vector space—searching for the most probable next word given the sequence so far. It's not so different from how you might complete the sequence 2, 4, 6, 8... with 10, recognizing the pattern of adding 2 each time.
But language models operate in spaces with hundreds or thousands of dimensions, allowing them to capture subtle relationships that would be impossible to visualize. GPT-3, for instance, uses vectors with 12,288 dimensions for each word—that's 12,288 numbers encoding different aspects of meaning.
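You can even watch one step of this navigation. Here's a minimal sketch using the Hugging Face transformers library and the small, publicly available GPT-2 model (a much smaller cousin of GPT-3, but the mechanism is the same): given a prompt, the model scores every possible next token, and generation is simply repeated sampling from that distribution.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load a small public model; larger models work the same way.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "What would you like for lunch? I'd like"
inputs = tokenizer(prompt, return_tensors="pt")

# One forward pass gives a score (logit) for every token in the vocabulary;
# softmax turns those scores into a probability distribution.
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits[0, -1], dim=-1)

# Show the five most probable next tokens: one step of the "search".
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx.item())!r}: {p.item():.3f}")
```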
This mathematical framework transforms the philosophical concept of the Library of Babel into something concrete and computable. Shakespeare wasn't randomly assembling letters to write Hamlet—he was navigating a probability space guided by his understanding of language, character, and human nature. Modern AI systems do something similar, using vector mathematics to find paths through the space of possible word combinations.
So when we ask an AI to write a story about a brave knight rescuing a dragon from an evil princess (subverting the usual trope), it's not creating from nothing—it's searching through vector space for word sequences that satisfy these constraints while maintaining coherence. The AI has learned which paths through this space tend to produce meaningful text, just as human writers have learned which combinations of words resonate with readers.
The Search That Feels Like Creation
So, was my skeptical friend right? Are AI systems just glorified autocomplete? Yes—but that's not the limitation they thought it was.
When we began our experiment, I asked you to recall a meal, name a restaurant, and imagine alien dining. Each task felt progressively more "creative," yet each involved searching through mental repositories of experiences and patterns. The difference wasn't in the mechanism but in the complexity of the search.
Language models operate the same way. Whether completing "What would you like for lunch?" with "a sandwich" or finishing a murder mystery with "the butler did it, hiding the weapon in the grandfather clock all along, having planted false evidence to frame the countess whose inheritance he coveted for decades"—the process is identical. Both are searching a probability space for the most fitting continuation based on observed patterns.
The murder mystery example is particularly revealing. To generate a satisfying conclusion, the model must track characters, motives, and clues established throughout the narrative. It must understand the conventions of the genre—the red herrings, the dramatic reveal, the moment of realization. It must maintain narrative coherence while delivering emotional impact. All of this is "just autocomplete," but it's autocomplete of staggering complexity and nuance.
There's something almost disappointing in this realization. We like to believe our creative moments are magical—transcending mechanical processes. Learning that creativity might be sophisticated pattern recognition can feel reductive, like learning how a magic trick works.
But understanding creativity as search doesn't diminish its wonder—it reveals its true complexity. When a poet finds the perfect metaphor or a novelist crafts an unforgettable ending, they're performing a search through possibility space, guided by judgment, emotion, and experience.
The distinction between search and creation turns out to be largely illusory. Creation is search—a beautiful, complex search through the Library of Babel for volumes worth reading. We're all searching, all the time, for the next word, the next idea that resonates.
And in that search lies all the magic of human creativity.