Content Marketing Blog

Website SEO: Why It Needs To Be Perfect

Welcome to the first step on your journey towards SEO perfection. We say “towards” because your website SEO will never be perfect. Sorry if you feel horribly misled. But please bear with us.

What we can do in this blog post is to show you why you should still strive for perfect website SEO. Even if you know you’ll never quite get there.

So, with expectations suitably managed, let’s jump right in. Here’s what you’ll get if you keep reading:

First, we’ve picked our landmark moments in the evolution of Google’s search algorithm. Ranking in Google’s search results is a moving target, which is part of the reason why the quest for SEO perfection never ends. We’ll look at how some of the more significant changes affected SEO.

Second, we’ve dug out some examples of what happens when website SEO is not just imperfect, but downright disastrous. Check out some classic SEO fails and what you can learn from them in part two.

Third, we’ve got some SEO hacks to share. There are a million blog posts on SEO hacks, but we’ve selected just a handful of the less obvious tactics you can try on your site right now.


As we said in the intro, Google’s search results are a moving target. The tactics you need to succeed at SEO have to constantly evolve. That’s because the algorithm that decides what should rank where is constantly being tweaked and updated.

If you want a comprehensive rundown of the 200 or so algorithm changes dating back to the year 2000, then look no further than Moz's Google Algorithm Update History. But if you're interested in what we consider to be the really big moments, then keep reading.


Website SEO Google Panda Update

We know what you’re thinking: Why the fox picture?

Like every big Google algorithm update, Panda has generated millions of pieces of content. All those blog images and memes and not a red panda in sight. Until now.

What was Panda?

That should be: what is Panda? Because Panda is a series of updates rather than a single event. Moz has tracked no fewer than 28 Panda updates since February 2011.

Panda first arrived during our own year one. We had just opened up in Sydney and were in the very early days of building our content marketing agency.

Our USP at the time was our ability to generate fresh, original content at scale. So, Panda, as you can imagine, was pretty helpful. SEOs were falling over themselves to get their duplicate content taken down and their lightweight pages beefed up with relevant, useful content.

That’s because Panda’s primary targets were pages or entire sites that offered little or no unique value to users.

Google said the first Panda update affected up to 12 per cent of English language queries. Some websites lost enough traffic to put their businesses at risk literally overnight.

The impact was significant enough for SEO to make the mainstream news. As The Guardian cleverly put it: Google had “put some websites on the endangered species list”.

Why does it still matter?

Panda was a line in the sand for Google’s quality drive. They knew that if search results were packed with lightweight, unhelpful or completely useless content they would lose their audience.

The Panda updates have been the tip of the spear as Google continues to find and rank pages that users want to see.


Website SEO Google Penguin

Baby penguins are another under-used stock image resource when covering Google. Righting that one here too.

What was Penguin?

Penguin, like Panda, was a series of Google updates. Where Panda was about weak content, Penguin was about black hat SEO.

Black hat SEO is a broad church. And there’s lots of grey around the edges. But it usually refers to SEO tactics that break Google’s guidelines.

Penguin went after two popular black hat SEO practices: link manipulation and keyword stuffing.

When Google first launched, links were its USP. Rather than relying almost solely on words on the page, Google's algorithm used inbound links. The logic: if another site is linking to this page, it must be a good resource for whatever the anchor text says.

No surprise then that Google has gone to some effort to protect the integrity of inbound links. Penguin pecked away at link farms (sites set up purely to give out links) and suspicious link graph signals (too many links with your favourite keyword as the anchor text).

Penguin also flagged pages with keywords crammed into the bits everyone knew Google paid more attention to (page title, H1 tags, image alt tags etc).

The impact of the first Penguin update wasn’t as widespread as Panda. It affected around three per cent of English language queries.

But for the sites that got hit, it was no less devastating. We helped one particular client with the clean-up from another agency’s paid link scheme. They had dropped from #1 to nowhere for their most important keyword. The impact on their business was almost fatal.

Why does it still matter?

It still matters because it can still get you. Penguin has been incorporated into Google's core search algorithm, so it now runs in real time.

That means if you get caught out stuffing keywords into your title tags or cruising bad neighbourhoods for links, it will be thanks to Penguin.


Website SEO Google Freshness

We know. It should have been cookies. Sorry.

What was Freshness?

If you asked 100 SEOs to name a Google algorithm update, Panda and Penguin would be your top answers for sure. But while Freshness didn't get anywhere near the same attention and digital column inches, Google said at the time that it affected 6 to 10 per cent of search results.

Freshness, as the name suggests, was about rewarding pages with more recent content. It applied to particular queries where the user was likely to be looking for something new or recently updated.

Why does it still matter?

We included Freshness in our list because it drove two important content strategy trends.

First was to use news stories in our clients’ industries as hooks for search-friendly blog posts. During big events or when a story was trending, Freshness meant that new content could punch well above its weight.

Second was the internal linking strategy HubSpot later dubbed “pillar and cluster”. SEO-savvy news sites would throw up landing pages for trending topics and then link back and forth to all their news stories on that topic. We did the same with clients’ landing pages and blogs. That worked largely because of Freshness.


Website SEO Google Mobilegeddon

Because all the mobiles would be broken during the “mopocalypse”.

What was Mobilegeddon?

Google’s big mobile update in April 2015 was heavily trailed. Months earlier, the company had said it was planning a major shake-up of its mobile search results.

But what really caught everyone’s attention was a comment from Google analyst Zineb Ait Bahajji who told attendees at an industry event that the impact would be bigger than Panda and Penguin.

As this Moz study, run seven days in, demonstrated, Mobilegeddon was a slow burn. But it makes our list because it represented Google's first big move against pages that provided a poor user experience on mobiles.

This came five years after then Google CEO Eric Schmidt made his famous "mobile first" declaration back in 2010. Mobilegeddon was about making that commitment a reality for sites whose SEO started and stopped on desktops.

Why does it still matter?

We work with a lot of B2B websites and often see a damaging mobile traffic feedback loop.

That’s when website owners misinterpret a low number of sessions on smartphones and tablets for a lack of demand. Often times the site is too hard to use on those devices so its pages just aren’t showing up in mobile search results.


Website SEO Google Rankbrain

Not an original idea this one. But fancy-looking brain image.

What was Rankbrain?

Rankbrain was and is Google’s flagship machine learning project.

It’s not artificial intelligence exactly (but then hardly anything that calls itself AI is actually AI). But it is Google handing over some level of control from bodies to bots.

Rankbrain can adjust the weighting of the various signals Google uses to organise search results. This juggles the results around with the goal of improving user satisfaction.

When page one results get a high bounce rate, low dwell time or other negative engagement metrics, that suggests something is wrong: the link shouldn't be among the top results. It's Rankbrain that steps in to fix it.

Why does it still matter?

Back in 2015, when it was first rolled out, Rankbrain was pointed at the 15 per cent of queries Google handles every day that have never been searched before.

Now, Rankbrain is one of Google’s most important tools in its ongoing effort to serve up results users want and expect.

It should mean that if you’re a white hat SEO, creating genuine, valuable content for your audience, your strong engagement metrics should filter through to better rankings. If they do, you have the machines to thank.



What happened?

Essentially this boiled down to a Penguin violation before Penguin existed.

This was February 2011. JC Penney had been smashing the competition in the run up to the previous Christmas, ranking on page one for a range of highly competitive product keywords.

They either had the best website SEO game in town or they were breaking the rules to gain an advantage.

There were a number of notable elements to this story. The first was that it started with an exposé by The New York Times, which enlisted an SEO expert to uncover what reporter David Segal called JC Penney's "dirty little secret". The answer, of course, was link farms.

Second was that it took The New York Times story for JC Penney to get found out. Their black hat tactics had worked. It was only when the story broke that Google imposed a manual penalty (although Google's head of anti-webspam, Matt Cutts, claimed at the time that the algorithms had already started to work it out).

The third noteworthy piece was what happened next. JC Penney’s manual penalty saw them booted off page one. Their rankings and organic traffic tanked as a result. But just 90 days later, as this Search Engine Land follow-up piece confirmed, they were out on parole.

What can we learn from this story?

The first lesson is that in 2011, Google was largely reliant on the myth that it was all-seeing and all-knowing. In reality, black hat link building tactics were very hard for its algorithms to spot and adjust for.

Second, if you engage in black hat SEO, outsource it to an agency so you have someone to fire when you get caught.

And third, don’t assume that JC Penney’s 90 days in the organic search wilderness is indicative. If you’re a small website that gets dinged for a breach of Google’s guidelines on that scale there’s no telling when you’d recover – if at all.


What happened?

In our section on Google’s Panda updates we mentioned that the effect on some websites was pretty devastating. There were lots of stories at the time about small ecommerce sites going out of business because their organic traffic had disappeared overnight.

This was because e-commerce sites would often use duplicate product descriptions. If you were a small business stocking thousands of items it’s understandable that you might just import the descriptions provided by your suppliers.

But post Panda, sites that had invested the time and effort in creating unique, in-depth product descriptions got a big uplift. These sites found themselves outranking much bigger rivals for the first time.

And because Panda is a rolling update, this is a story that has repeated itself. Take Panda 4.0, for example, which was rolled out in May 2014. One of the highest profile casualties of that particular iteration was eBay, the online auction site.

Some estimates put eBay's loss in organic rankings at 80 per cent. The impact was especially damaging because eBay had made a strategic shift away from paid search to focus on SEO. It was therefore much more reliant on traffic from organic search when Panda 4.0 landed.

What can we learn from this story?

There are two useful lessons from eBay's Panda 4.0 experience. The first is that unique content that offers value to users is always a good investment for SEO.

The second is that it is dangerous to rely too heavily on one single source of website traffic. Even if you are a website SEO master, you remain at the mercy of Google’s algorithm updates.

That makes it important to diversify. You can do that by investing in paid search and social media ads. Building up your referral network. And leveraging marketing automation to run a proper email strategy.


What happened?

It’s fair to say the iconic toy shop chain Toys R Us has had bigger problems than SEO in recent years.

But back in 2009, they had SEO nerds all excited about what happens when you do a 301 redirect of an entire domain.

Toys R Us had just paid USD 5.1 million for the Toys.com domain name. According to GoDaddy, which sells domain names, that was the 25th most expensive domain name deal of all time. The current #1 changed hands for just shy of USD 50 million.

Toys.com wasn't just a parked domain. It was a living, breathing website with strong domain authority and useful organic rankings for some valuable keywords.

At first, Toys R Us ran it as a separate website, but after a few months took the decision to redirect the entire domain to Toysrus.com.

What Toys R Us did was effectively remove Toys.com from Google's index without any obvious benefit – at least from an SEO standpoint – for its main domain. On the face of it, this looks like a very expensive SEO fail.

What can we learn from this story?

The problem here is that redirects do not necessarily pass on SEO value from old page to new page.

If you have a page about premium widgets and you set up a 301 redirect to your new, improved premium widgets page, chances are the new page will keep the old page’s standing in organic search.
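For anyone wondering what "setting up a 301 redirect" actually involves, it's just an HTTP response: a 301 status code plus a Location header pointing at the new page. Here's a minimal sketch using Python's standard library – the widget URLs are made up for illustration:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of retired URLs to their replacements.
REDIRECTS = {
    "/premium-widgets-old": "/premium-widgets",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 marks the move as permanent, which is the signal
            # search engines look for when transferring a page's standing.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet
```

Any web server or CMS can do the same thing with a rewrite rule; the detail that matters is using a permanent 301 rather than a temporary 302.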

But when there is a significant difference between the content on your new page and the content on your old page, there’s no guarantee that will be the case.

And that’s pretty logical. If you made a really great page about apples, it ranked because it was about apples. If you redirect that page to a new page about oranges then it shouldn’t keep ranking for all those old apple keywords.

This is important to keep in mind if you run more than one website, acquire a new site in a merger or fancy breaking into GoDaddy’s top 25.


What happened?

Doorway pages are a black hat SEO technique: pages set up to show one thing to search engines and another to users.

The most famous case of a doorway pages scheme getting detected and punished was BMW. Back in 2006, Google delisted the car giant’s German website from its index. The BBC called it a “death penalty”.

But a much more interesting example of this SEO bait and switch is when Google caught Google doing it.

It was 2010 when bloggers noticed some Google Ads (or AdWords as it was then) help pages showing content to search engine crawlers that differed from what was actually on the page.

Important to note here: it's unlikely Google was trying to game its own algorithm. And even if it was, the company was big enough in 2010 for the AdWords and anti-spam teams to be as separate as two completely different businesses.

More likely this was a mistake. That’s how a Google spokesperson described it to Search Engine Land at the time.

What can we learn from this story?

Website SEO should be about making your pages easier for Google to crawl and index. But any tactics designed to trick Google into ranking your content higher than it deserves will eventually backfire.

What this story shows is that you can go black hat by accident. Breaking Google’s rules not because you have a nefarious plan to get ahead of the competition. But because you made a mistake.

It highlights the importance of employing or outsourcing to genuine website SEO experts. They can ensure your strategy stays white hat at all times.



Brian Dean of Backlinko developed what he called the Clickthrough Rate Method. The idea is to play to the same signals that Rankbrain is thought to be looking at when it stirs the pot to boost user satisfaction.

You’ll remember from part one, Rankbrain is the machine learning algorithm that adjusts the weighting of ranking factors based on engagement metrics. So, if content in position #10 has great engagement, bump it up. Content in position #3 has poor engagement, push it down.

Dean’s idea is to look at the top Google Ads for your target keyword. Then use some of the same language in your headlines and meta descriptions of the pages you want to rank organically.

The assumption is that the copy used in ads will have been tested and refined over time to maximise clickthrough.

If you can transfer some of what makes those ads work into your organic links, you should get more clicks. If the content is good enough, users will engage with it. And Rankbrain should do the rest.
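Dean's method is manual, but the "borrow the recurring ad language" step is easy to sketch in code. This Python example – with invented ad headlines standing in for the real ads you'd copy from the results page – simply counts the words that keep appearing across the ads for a keyword:

```python
from collections import Counter
import re

STOPWORDS = {"the", "a", "and", "for", "to", "of", "in", "your", "with", "on"}

def recurring_ad_words(ad_headlines, min_count=2):
    """Return words that appear in at least min_count of the ad headlines."""
    counts = Counter()
    for headline in ad_headlines:
        # Count each word once per headline, ignoring filler words.
        words = set(re.findall(r"[a-z']+", headline.lower())) - STOPWORDS
        counts.update(words)
    return {word for word, n in counts.items() if n >= min_count}

# Hypothetical headlines copied from the paid ads for a target keyword.
ads = [
    "Premium Widgets - Free Next Day Delivery",
    "Buy Premium Widgets Online | Free Returns",
    "Premium Widgets On Sale - Free Shipping",
]
print(recurring_ad_words(ads))  # 'premium', 'widgets' and 'free' keep recurring
```

Words that survive the cut – "free" in this toy example – are candidates for your page titles and meta descriptions.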


Jeff Baker, one of our friends at Brafton, has done extensive research on the relationship between content depth and organic rankings.

Content depth is about the number of popular and important topics a particular piece of content features.

So, rather than focusing on word count to improve their content, SEOs should focus on topics.

You start with pages on your site that already rank in search. The idea being that it is usually easier to move up a few spots in Google’s search results than to hit page one from nowhere.

Next step is to find the topics.

You can do that manually by reviewing pages that are currently ranking among the top results for your target keyword.

Or you can use a tool like MarketMuse or SEMrush to do the legwork for you.

Once you have identified the topics most commonly featured by the top-ranked pages, you can adjust your content to make it the most comprehensive resource for your target keyword.
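If you'd rather script the gap analysis yourself, here's a rough Python sketch. The topic list, competitor pages and page text are all invented for illustration; a real version would scrape the ranking pages and pull candidate topics from a tool like MarketMuse or SEMrush:

```python
from collections import Counter

def missing_topics(my_page, competitor_pages, candidate_topics, min_pages=2):
    """Return candidate topics most top-ranked pages cover but ours doesn't.

    All pages are plain lowercase text here; a real version would
    scrape and clean the ranking pages first.
    """
    counts = Counter(
        topic
        for text in competitor_pages
        for topic in candidate_topics
        if topic in text
    )
    # Keep topics covered by enough competitors but absent from our page.
    return sorted(t for t, n in counts.items()
                  if n >= min_pages and t not in my_page)

# Hypothetical example for a page targeting "widget maintenance".
topics = ["cleaning", "lubrication", "storage", "warranty"]
competitors = [
    "guide to widget cleaning, lubrication and storage",
    "widget cleaning schedules and lubrication tips",
    "how storage conditions affect widget lifespan",
]
mine = "our complete guide to widget cleaning"
print(missing_topics(mine, competitors, topics))  # → ['lubrication', 'storage']
```

The output is a to-do list: the topics your page needs to cover before it can claim to be the most comprehensive resource for the keyword.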


This is apparently Neil Patel’s favourite organic traffic hack.

We like it because it’s similar to Jeff’s work on content depth and it’s about creating richer, more useful content.

The idea here is to use Google’s suggested searches to find new topic ideas.

So, if you’re creating a cornerstone blog post on purple widgets, you would start typing “purple widgets” into Google and make a note of the drop down suggestions.

You can then incorporate content that answers these suggested queries into your cornerstone blog post. Or you can create dedicated posts to target them that link to and from that cornerstone piece.

If you don’t fancy researching suggested searches manually, Patel has a free tool that can do it for you called Ubersuggest.

Adam Barber
About the author

Adam is one of Castleford's founders and remains actively involved in the day-to-day running of the business. He started out as a writer and still contributes regularly to our blog, covering SEO, CRO, social media and digital strategy.

Read more of Adam's articles