In the early days of SEO, long before we spoke about entities, semantics, or language models, I worked like a researcher. I would sit for hours in front of an Access database I had built for myself, documenting every tiny change I made on a website: what worked, what didn’t, and what moved a page even a single position. This was the DNA of my work: experiment, document, learn.
One day, I took four websites in the signage industry, all direct competitors, and began analyzing their titles and meta descriptions. Very quickly, a phenomenon jumped out at me that repeated across all of them with almost mathematical precision: The primary keyword appeared at the beginning of the title, and in the meta description, it appeared at the start of the sentence and then again exactly after four words. Not three, not five—four. It was so consistent that all four sites presented the exact same pattern.
I decided to test this on my own site, which was stuck on the second page at the time. I changed the title and meta description according to the formula I had identified: keyword at the start of the title, keyword at the start of the meta description, and again after four words. Four days later, the site jumped to third place for a very competitive term.
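For the curious, that old “formula” can be written down as a few lines of code. This is only my reconstruction of what I was checking back then; the function name and the exact word-offset interpretation are mine, and the heuristic itself is long obsolete:

```python
def matches_old_pattern(keyword: str, title: str, meta: str) -> bool:
    """Reconstruction of the 2000s-era pattern: keyword at the start of the
    title, at the start of the meta description, and again four words later.
    Obsolete -- kept only to document what I was measuring at the time."""
    kw_tokens = keyword.lower().split()
    meta_tokens = [w.strip(",.") for w in meta.lower().split()]

    title_ok = title.lower().startswith(keyword.lower())
    meta_start_ok = meta_tokens[:len(kw_tokens)] == kw_tokens

    # "again after four words": the keyword reappears after four intervening words
    offset = len(kw_tokens) + 4
    repeat_ok = meta_tokens[offset:offset + len(kw_tokens)] == kw_tokens

    return title_ok and meta_start_ok and repeat_ok
```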
Of course, I documented everything. It was one of those moments where you feel like you’ve discovered a small “physical law” of Google. And it truly worked… until Google changed the algorithm again, and that law became irrelevant almost overnight.
But that experiment taught me something much bigger: Google may change, but the ability to identify patterns, test them in the field, document, learn, and adapt—that is what stays with you for your entire career.
What Changed in Google — and Why the Method Stopped Working
After years in which keyword matching was almost an exact science, a deep shift began inside Google. It wasn’t a single update; it was an evolution.
- Hummingbird (2013) – Google begins to understand sentences, not just words. Google moved from Keyword Matching to Semantic Matching. Instead of asking “Does the word appear?”, it started asking: “What does the user actually mean?”
- RankBrain (2015) – Google begins to learn on its own. RankBrain was Google’s first algorithm based on Machine Learning. It no longer relied only on matches; it started to guess, learn, and understand.
- BERT (2019) – Google begins to understand human context. Google took another leap, starting to understand context, intent, meaning, and the relationships between words.
In other words: Google stopped being a search engine for words and became a search engine for meaning.
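To make that difference concrete, here is a minimal sketch of the two approaches side by side. It assumes the open-source sentence-transformers package and the all-MiniLM-L6-v2 model, which are my illustrative choices and have nothing to do with Google’s internal stack:

```python
# pip install sentence-transformers
from sentence_transformers import SentenceTransformer, util

query = "signs for business"
page = "We design storefront signage that helps local companies get noticed."

# Old-style keyword matching: does the literal phrase appear on the page?
print("keyword match:", query in page.lower())        # False -- no shared words

# Semantic matching: compare the *meaning* of the two texts via embeddings
model = SentenceTransformer("all-MiniLM-L6-v2")
query_vec, page_vec = model.encode([query, page])
print("semantic similarity:", util.cos_sim(query_vec, page_vec).item())
# A clearly positive score even though the keyword never appears literally
```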
What does this mean for my experiment from the past?
The formula that worked for me then (keyword at the start of the title + keyword at the start of the meta + repetition after four words) was right for an era when Google “counted” words. But once Google started “reading”, that formula became irrelevant, and even dangerous.
Why Keyword Stuffing Hurts Today — and What Google Does Instead
When I realized the old method was not only unhelpful but potentially harmful, I started checking what was actually happening. Today, when I repeat a keyword twice at the start of a meta description, especially in a mechanical way, Google flags it as over-optimization. In some cases, it ignores my meta entirely and replaces it with random text from the page.
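To make “mechanical repetition” a little more concrete, here is the kind of rough check I now run on my own meta descriptions. The threshold and the word window are my own guesses, not anything Google has published:

```python
def looks_over_optimized(keyword: str, meta: str, window: int = 12) -> bool:
    """Flag a meta description whose opening words repeat the keyword
    more than once -- exactly the pattern I used to build on purpose."""
    opening = " ".join(meta.lower().split()[:window])
    return opening.count(keyword.lower()) > 1

meta = "Business signs, business signs and more business signs at the best prices."
print(looks_over_optimized("business signs", meta))   # True
```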
This is exactly what Lily Ray talks about. In the analysis of her interview on my site, fayzakseo.com, she sums up everything I see in the field in one sentence:
“Many sites are falling in rankings not because of a lack of SEO — but because of too much SEO.”
That sentence hit me hard because it describes exactly what I experienced: what once worked for me now looks to Google like an attempt to “push” the algorithm.
Furthermore, it hurts CTR. The user of 2026 is much smarter. When they see a phrase repeated unnaturally, it feels like an ad rather than a real answer. And when they feel like they are being sold to—they simply don’t click.
The meta description, once a technical SEO tool, has become a UX tool. I no longer write it for Google—I write it for the user. Its goal today is to attract, explain, build trust, and reflect the content—not to cram keywords.
Google’s NLP understands context, intent, meaning, and relationships. It knows what “signage” is. It knows what a “business” is. It knows the connection between them. Today, Google understands what I once had to spoon-feed it.
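As a small illustration of how far even off-the-shelf NLP already goes, here is a sketch using the open-source spaCy library and its small English model. The sentence, the library choice, and the example outputs in the comments are mine, purely for demonstration:

```python
# pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Acme Signs produces illuminated signage for small businesses in Tel Aviv.")

# Named entities the model recognizes out of the box
for ent in doc.ents:
    print(ent.text, ent.label_)            # e.g. "Acme Signs" ORG, "Tel Aviv" GPE

# Noun phrases and the verbs they attach to hint at the relationships
for chunk in doc.noun_chunks:
    print(chunk.text, "->", chunk.root.head.text)   # e.g. "illuminated signage" -> "produces"
```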
The Shift from Keywords to Entities — and How It Changed the Game
As I delved deeper, I realized the real revolution wasn’t an “algorithm update,” but a change in perception. Google stopped looking for keywords—and started looking for Entities.
To Google, “signs for business” isn’t a phrase. It’s a relationship between entities: product, business, need, context, and user intent. Once I understood this, I understood why my old methods failed: I was busy with words—while Google moved to understanding meaning.
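Google even exposes a small public window into this entity view. The sketch below queries the Knowledge Graph Search API to see which entities a phrase resolves to; you need your own API key, and nothing here should be read as a ranking signal, only as a way to peek at the entity layer:

```python
# pip install requests
import requests

def knowledge_graph_entities(query: str, api_key: str, limit: int = 3) -> None:
    """Print the top entities Google's public Knowledge Graph API returns for a phrase."""
    resp = requests.get(
        "https://kgsearch.googleapis.com/v1/entities:search",
        params={"query": query, "key": api_key, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    for item in resp.json().get("itemListElement", []):
        result = item["result"]
        print(result.get("name"), result.get("@type"), item.get("resultScore"))

# Requires a Knowledge Graph Search API key from Google Cloud:
# knowledge_graph_entities("signs for business", api_key="YOUR_API_KEY")
```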
But there is a deeper layer: not just Entity Recognition, but Entity Validation. This is a whole world in itself, related to how Google connects identity, behavior, professional connections, and real-time information. I wrote a full article about this on dev, including examples and architectural analysis: “Gmail is Not a Mailbox – It’s Your Sensor Inside Google’s Matrix”
