A breakthrough for scientists after they had a 20-minute conversation with a humpback whale.
An animal bigger than a school bus with perhaps the most sophisticated form of communication that has ever existed.
A researcher at MIT stares at her screen and whispers four words out loud.
This isn’t supposed to be possible.
The AI has just finished processing 9,000 sperm whale recordings.
And what it found isn’t random animal noise.

It’s a phonetic alphabet complete with vowels that look structurally identical to the building blocks of human speech.
>> Not similar, identical.
And nobody in that room is able to explain how that is possible.
The assumption that collapsed.
For decades, the scientific consensus on sperm whale communication was settled.
They used roughly 21 distinct click patterns called codas.
Each one carried a fixed simple meaning.
Codas are groups of clicks.
So, what we found is that in a sense, their codas, these groups of clicks, are very similar to human vowels.
>> A warning, a direction, a contact call.
Researchers treated whale communication the way you would treat a traffic light.
A handful of states, a fixed set of meanings, nothing approaching complexity.
We have looked at sample data from sperm whale communication which consists of many, many sequences of individual alphabet pieces called codas.
That assumption was not just wrong.
It was one of the most spectacular miscalculations in the history of modern biology.
The story of how it collapsed starts with a marine biologist sitting in his office with headphones on doing something completely unrelated.
David Gruber was at Harvard University’s Radcliffe Institute reviewing sperm whale recordings he had collected while studying jellyfish in the Caribbean.
The sounds were what they always were, rapid-fire bursts of mechanical clicking like Morse code played underwater.
He had heard them a hundred times, background noise from an animal he assumed science had already figured out.
MIT cryptographer Shafi Goldwasser was walking past his open door.
She stopped, listened, then said something that changed the entire direction of the field.
That sounds like Morse code.
Have you tried using machine learning to decode it? Gruber hadn’t.
Nobody had.
Goldwasser spent her career cracking actual encrypted communication systems, the kind governments use to hide state secrets.
Her instinct here wasn’t casual curiosity.
It was professional pattern recognition.
To an expert in hidden structure, those whale clicks were not noise.
They were a code that had never been run through a decoder.
That conversation led to a third person, Michael Bronstein, an artificial intelligence researcher who explained that the same deep learning algorithms powering large language models like ChatGPT could, in theory, be applied to animal communication as long as researchers could gather enough training data.
Over the next five years, our team of AI specialists, roboticists, linguists, and marine biologists aims to use the most cutting-edge technologies to make contact with another species.
More recordings than any single scientist had ever collected.
More behavioral context than any existing whale study had attempted.
More compute than any marine biology project had ever used.
In 2020, that ambition became Project CETI, the Cetacean Translation Initiative.
Funded by the TED Audacious Project with partnerships from MIT, Harvard, UC Berkeley, and 15 other institutions, the team built a permanent research station in Dominica where multi-generational sperm whale families have been observed for decades and individual animals are known by name.
The island sits in a stretch of Caribbean water that sperm whale clans return to reliably year after year.
The families are documented, the matriarchs are identified, the social hierarchies have been mapped through decades of patient surface observation.
The team then went further than any previous whale study in instrumentation.
Hydrophone arrays permanently installed along migration routes.
Bio-logger tags attached directly to whale skin via suction cup.
Each carrying three synchronized microphones capable of distinguishing individual voices in a group conversation.
GPS logging.
Depth sensors.
Accelerometers tracking body movement to within fractions of a degree.
The system was engineered to capture not just what the whales said, but who they were saying it to, what they were physically doing at that moment, and what was happening in the acoustic environment around them.
Every variable that could give meaning to sound was being logged simultaneously.
That was the design intention from the start.
Not just to hear the language, but to build the contextual record that might eventually allow it to be understood.
9,000 recordings.
The same class of deep learning AI that powers large language models.
These weren’t the typical harmonious whale songs that I’d been accustomed to.
These sounded more like digital data transfer.
And what came back rewrote the textbooks.
What the AI found.
The AI didn’t find 21 codas; it found 156.
But the number was almost beside the point.
What mattered was what was inside them.
Each coda has internal structure, tempo variations, rhythm shifts, ornamentation clicks layered in recurring patterns.
Those elements aren’t decorative, they’re functional.
They work the way phonemes work in human language, modular units combined according to rules to generate meaning that no single unit carries alone.
The whales were not transmitting 156 preset messages like a fixed menu of signals.
They were constructing communication dynamically from a modular phonetic system.
The same fundamental architecture that underlies every human language ever documented on Earth.
One analogy makes this concrete.
English builds hundreds of thousands of words from roughly 44 basic sounds.
Remove one of those sounds and thousands of words vanish.
Add a rule about how two sounds combine and new words become possible.
Sperm whales appear to operate on exactly the same principle.
A finite acoustic inventory producing a generative system with effectively unlimited expressive range.
The building blocks are clicks.
The grammar is real.
The output is a communication space that researchers are only now beginning to map.
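As a toy illustration of that generative principle, and not Project CETI's actual model, here is how quickly a small, fixed inventory of units explodes into a large message space once units are allowed to combine into ordered sequences. The unit names are hypothetical stand-ins for click types:

```python
# Toy illustration only (not Project CETI's model): a small, fixed
# inventory of basic units yields a vast message space once the units
# can combine into ordered sequences.
inventory = ["click_A", "click_B", "click_C", "click_D"]  # hypothetical units

def message_space(units, max_length):
    """Count every distinct ordered sequence of 1..max_length units."""
    return sum(len(units) ** n for n in range(1, max_length + 1))

# 4 units produce only 4 one-unit messages, but thousands of longer ones.
print(message_space(inventory, 1))  # 4
print(message_space(inventory, 6))  # 5460
```

The arithmetic is the whole point: the expressive range grows exponentially with sequence length, which is why a finite inventory can behave like an effectively unlimited system.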
Dr. Daniela Rus from MIT described the findings to reporters.
Our results show there is much more complexity here than previously believed.
This is challenging the current state of beliefs about the animal world.
That is the most understated sentence in modern science because nothing about what the AI found was a small adjustment to existing knowledge.
It was a structural demolition of the assumption that nothing in the animal kingdom had ever independently evolved anything resembling the architecture of human language.
Except something had, and it had been doing it in the dark a mile underwater for 30 million years.
The finding that stopped the room.
Then a linguist at UC Berkeley ran a different kind of analysis.
Instead of mapping coda patterns at the sequence level, Gašper Beguš examined the acoustic properties of individual clicks at the most granular resolution available to modern phonetics equipment.
What he found made his field’s foundational assumptions shudder.
The whales were producing vowel sounds, specifically the ah vowel (the a in father), the ee vowel (the e in see), and diphthongs, those gliding double-vowel combinations like the oy in boy.
We concluded that this spectral pattern, basically a whole new dimension in their communication system, is meaningful.
The defining acoustic features of human spoken language independently evolved in a creature living in complete darkness under thousands of pounds of water pressure.
Vowels are not a simple category of sound.
They are precise, controlled modulations of frequency through specialized vocal anatomy, the kind that requires an evolved larynx, specific musculature, and the neural capacity to control it with intention.
That combination was considered uniquely human, a product of evolutionary pressures specific to our lineage in our environment over our timeline.
Beguš described the finding plainly.
In the past, researchers thought of whale communication as a kind of Morse code, but this paper shows their calls are more like very, very slow vowels.
This suggests a complexity that approaches human language.
Read that sentence carefully.
A complexity that approaches human language from a species that developed its communication system in a completely different sensory world, along a completely independent evolutionary path, with no common ancestor with humans for nearly a hundred million years.
Consider what independent evolution of vowels actually means in biological terms.
The human vocal tract evolved over millions of years of selective pressure toward language.
A descended larynx, specific tongue musculature, precise control of airflow through a resonance chamber shaped for speech.
We evolved toward vowels.
Sperm whales arrived at them from a completely different starting point through echolocation clicks in an acoustic environment where high-frequency precision determines whether you eat or starve.
They did not evolve toward language the way we did.
They evolved toward something that, when analyzed by the most advanced acoustic equipment available, looks functionally identical.
That’s not a parallel, that’s convergent evolution.
Two separate lineages arriving at the same complex solution through entirely different routes.
And when it happens with something as specific as vowels, the only interpretation that makes sense scientifically is that vowel-based communication is not an accident of human anatomy.
It is a feature of intelligence itself.
The idea that language, vowels, phonemes, combinatorial grammar, pragmatic context was exclusively ours was one of the last clean lines separating human cognition from everything else alive on this planet.
The sperm whale alphabet doesn’t challenge that line.
It removes the foundation it was drawn on.
If this finding is hitting you the way it hit the researchers in that room, subscribe because what comes next is where it gets truly unsettling.
A conversation a mile down.
The whales don’t just possess the structural components in isolation.
They use them contextually.
And that distinction matters enormously.
I became interested in sperm whales when I heard their sounds.
They sounded like they were coming from another universe.
A siren song being broadcast from the darkest reaches of the sea.
The same individual whale produces different codas depending on who it is addressing, what the other whale just said, and what is happening in the immediate environment.
That is not a signal system with fixed inputs and fixed outputs.
That is pragmatic language use.
The ability to deploy communication responsively in real time adjusted to a shifting social situation.
The same flexibility humans show when they speak differently to a child than to a colleague or soften a message depending on who is listening.
Shane Gero has spent 13 years alongside sperm whale families in Dominica.
He describes watching these interactions unfold from the surface.
It is hard not to see cousins playing while chatting.
To not see a mother hand her calf to a babysitter and exchange what looks for all the world like a few parting words before diving deep to hunt.
The exchanges last up to an hour.
Multiple whales overlap, respond, build on each other’s contributions.
There is measurable conversational timing.
Rhythm.
Turn-taking that occasionally breaks down when two whales click simultaneously.
Which Gero notes appears to be socially acceptable in sperm whale culture.
The way interrupting can be among close family.
Consider what that actually requires from a brain.
To produce the right coda for the right whale in the right moment requires not just memory and acoustic control.
It requires a theory of mind, an awareness that the other individual has a perspective different from your own.
And that what you say should be calibrated to reach them specifically.
That capacity was once considered uniquely human.
Researchers studying sperm whales are no longer certain that framing holds.
There are dominant voices and quieter ones.
There are exchanges that look like disagreements and exchanges that look like reassurances.
There are interactions that occur only between specific individuals within a group and others that appear to include the whole family.
The social architecture visible at the surface maps onto acoustic patterns measurable below.
The conversations aren’t random.
They’re structured around relationships.
The brains sustaining those conversations are the largest of any animal on Earth.
Six times the mass of a human brain.
Sperm whales live in matrilineal families with multi-generational social structures, cultural traditions transmitted from grandmother to mother to calf across centuries, and coordinated deep-water hunting strategies executed in pitch darkness across hundreds of meters of open ocean.
Behaviors that require real-time multi-party information exchange to work at all.
Sperm whales communicate before they hunt and when they socialize.
So, we think that there might be more meaning to their communication system.
>> These are not simple animals.
They never were.
Science was not listening closely enough to understand what they were actually doing.
Structure is not meaning yet.
Here is where precision matters.
The AI has mapped the grammar.
It has not translated a single word.
Structure is not meaning.
The phonetic inventory has been identified.
The combinatorial rules have been modeled.
The vowels have been cataloged.
Individual whales have been profiled by their acoustic signatures.
The conversational patterns between family members have been documented across years of recording.
All of that is real, verified, and peer-reviewed.
What remains completely unknown is what any of it encodes.
What does a specific coda sequence mean? What is the difference in content between a rising tempo pattern and a falling one? What does it mean when a mother clicks at length to her calf versus when a matriarch addresses the whole group? Nobody knows.
And that gap between knowing the architecture of a language and knowing what it says is the entire next phase of the project.
Meaning requires correlation.
It requires building a data set deep enough that patterns emerge between specific vocalizations and specific observable behaviors.
Matching what is said to what is visibly happening in that exact moment.
Next-generation biologger tags are logging audio from three synchronized microphones alongside depth, GPS coordinates, body orientation, and the acoustic output of every nearby whale simultaneously.
All feeding into a growing behavioral archive that researchers are cross-referencing against decades of observational records.
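The correlation idea can be sketched in miniature. Everything below is hypothetical: the coda labels and behavior tags are stand-ins, and the real analysis is far richer than counting. But it shows the basic move, matching which vocalizations co-occur with which observed behaviors:

```python
from collections import Counter

# Illustrative sketch only: hypothetical records pairing a coda type
# with the behavior observed at that moment. Real CETI data spans
# audio, depth, GPS, and body orientation, not flat labels like these.
observations = [
    {"coda": "1+1+3", "behavior": "dive_start"},
    {"coda": "1+1+3", "behavior": "dive_start"},
    {"coda": "5R",    "behavior": "calf_contact"},
    {"coda": "1+1+3", "behavior": "surface_rest"},
    {"coda": "5R",    "behavior": "calf_contact"},
]

def strongest_association(records):
    """For each coda, find the behavior it most often co-occurs with."""
    pairs = Counter((r["coda"], r["behavior"]) for r in records)
    best = {}
    for (coda, behavior), n in pairs.items():
        if n > best.get(coda, ("", 0))[1]:
            best[coda] = (behavior, n)
    return best

assoc = strongest_association(observations)
print(assoc["1+1+3"])  # ('dive_start', 2)
```

The dictionary-building problem is exactly this pattern scaled up: enough synchronized context that statistical associations between sound and situation stop being coincidence.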
Consider what seven decades of unread language actually means.
A sperm whale can live 70 years.
The matriarch of a family alive in the ocean today may have been born before sonar existed.
She has spent her entire life communicating with her family in a system that science only just admitted might be language.
Whatever she has said across those decades, whatever knowledge she transmitted, whatever warnings she issued, whatever she told the young ones about how the world works, all of it happened in a language we could not read.
Project CETI is not entering that conversation as translators with a finished dictionary.
It is entering as students who have just learned the alphabet, trying one behavioral correlation at a time to figure out what the words mean.
And every hour of new recording makes the dictionary slightly less blank.
What changes if they’re right?
In the 1960s, researchers Roger Payne and Scott McVay showed that humpback whales sing complex, evolving songs.
Before that discovery, most people thought of whales as large unremarkable marine animals.
Commercially valuable, ecologically present, but not cognitively significant.
That one finding changed the public’s entire mental model of what a whale was.
It sparked the Save the Whales movement, led to the Marine Mammal Protection Act, and pulled multiple species back from the edge of extinction.
One discovery about whale intelligence rewrote international conservation law.
Proving sperm whales have language, not songs, not signals, but a compositional linguistic system with vowels, phonemes, and grammar could make that look modest.
Gero explains why the framing matters.
When we can talk about how important whale grandmothers are to their families, or the importance of being a good neighbor, or cultural diversity in whale communities, that resonates with people in ways abstract conservation arguments never can.
People protect what they relate to.
Language creates relation.
And if the public comes to understand that sperm whales are not just intelligent animals, but communicating ones, beings that tell each other things, that remember things, that pass knowledge down through generations the way human cultures do, then every whale becomes not a marine statistic, but a speaking member of a community.
These animals could be the most intelligent beings on this planet.
They have a neocortex and spindle cells, structures that in humans support higher-order thought, emotion, memory, language, and love.
That shift in perception has legal consequences, policy consequences, and moral consequences that researchers are only beginning to map.
The stakes go further.
Commercial whaling killed hundreds of thousands of sperm whales in the 20th century.
Ship strikes kill dozens every year.
Underwater noise pollution from industrial shipping and military sonar disrupts their communication.
Broadcasting interference into the acoustic space where an entire community holds every social exchange, every family conversation, every transmission of cultural knowledge, every day for decades.
If that community speaks a language, then the noise isn’t just environmental damage.
It’s something closer to the systematic destruction of a culture’s ability to communicate.
Legal scholars are already asking what follows from this science.
If sperm whales use language to discuss ideas, maintain relationships, and transmit cultural knowledge across generations, do they qualify for legal personhood? Do they have rights the law has never been forced to confront? The AI is still running.
The data keeps arriving.
The day is coming when a researcher sits at a screen and sees not just structure, but words.
The question nobody is ready for.
What has that matriarch been telling her family for 70 years? What did a sperm whale mother say to her calf the first time a human ship passed overhead? What do families communicate about the warming water, the disappearing prey populations, the noise that now fills every ocean basin on Earth and has nowhere left to go?
The sperm whales alive in the Caribbean today have watched the ocean change in ways no human has documented from the inside.
They have lost family members to ships they could hear coming and could not escape.
They have watched the acoustic environment they rely on for communication become increasingly saturated with industrial interference.
If they have language, and the evidence now says they do, then they have been talking about all of this.
They have been talking about us.
That is the weight of what Project CETI is reaching for.
Not just the translation of an animal communication system, the decoding of a perspective on this planet that has existed for 30 million years, that has watched humans arrive and industrialize and alter the ocean.
Project CETI will build an open-source platform where we will make our data sets available to the public, encouraging the global community to come along on this journey for understanding.
A perspective that has had something to say about all of it, in a language we are only now learning exists.
The alphabet has been found.
The vowels have been identified.
The first message was transmitted long before any of us were listening.
A million times across millions of years in a language we are only now learning to read.
We don’t know yet what it says.
What do you think they’ve been saying? Drop it in the comments.
Subscribe, because the moment Project CETI decodes the first full sentence, you’ll want to be here when it surfaces.
And if this one got under your skin, the next video’s already waiting.