In this article, an anonymous author shares his journey and insights on ranking a parasite for high-KD (keyword difficulty), high-traffic keywords by reverse-engineering Google's algorithm.
When the author was new to SEO, he focused solely on creating helpful content without much concern for backlinks or authority. Despite spending 3-4 days researching and writing each article, his pages rarely ranked. Meanwhile, seemingly low-quality pages from news or general niche sites would effortlessly claim top spots.
Seeing this, the author realized that well-established, high-authority websites often rank easily on Google — even if their content isn't closely related to the search terms. Motivated by this discovery, he decided to dig deeper into how this works and share what he learns with others.
A common scenario
A typical exchange on forums might go like this: A newbie, Little Manny, asks, “I created a coupon site a year ago but can’t find it on Google. Someone suggested I need backlinks to rank, so I bought 1 million backlinks for $5. When will my site rank?” An SEO expert, Brad Rolex, replies, “Hi Little Manny, $5 backlinks won’t help. You need to spend at least $1,000 on quality backlinks and another $1,000 monthly for top-notch articles. Google’s senior webmaster, Mr. MoFo, says content is king.”
While the advice isn’t wrong, the author notes that there’s no one-size-fits-all solution. Each site needs a tailored approach based on specific keywords. He suspects that Google uses a domain authority metric, allowing high-authority sites to rank without much relevant content.
To test this theory, the author, let's call him Jack for convenience, wanted to see whether he could rank web pages on the strength of a domain's trust and authority alone.
Requirements
Steps
Observations
Jack's Perspective
Jack believes that mainstream niches like weight loss and finance are challenging to rank in, though he hasn't tested them yet. As he's focusing on CPA, he thought it best to explore relevant, albeit spammy, niches such as game hacks, freebies, and dating. He's keeping his chosen niche under wraps but has given a hint with these examples.
Parasites
Jack found the world of parasites vast and difficult to navigate. He scoured hundreds of keywords to find parasites with consistent rankings. Some were visible for specific keywords but vanished elsewhere. Jack needed a parasite that could rank across all niches, allowing him to use the same techniques for all his projects. He narrowed it down to Google Sites and Medium, ultimately choosing Google Sites because Google favors its own properties and the pages can be tracked in Google Search Console.
Jack shared an intriguing trend from Semrush, showing a surge in organic keywords for Google Sites after the April core update:
The growth wasn't uniform, though: almost all of the new keywords pushed Google Sites pages only into the top 50 results. Jack attributed this to Google's preference for authority sites that cover numerous topics.
Content
A disciple once asked Lord Buddha, "What's the secret of success?" Buddha smiled and replied, "Keywords."
Jack understood that posting random content wouldn't cut it; he needed to write content that already had demand. The chosen keywords had high traffic but also high difficulty and competition.
Knowing Google's algorithm can't actually read, Jack decided to post spun content. He found low-authority sites with fairly long, readable content and spun it with the first tool that came up in a search for "free article spinner." After submitting his site to Google Search Console, Jack left it alone for three months to gain momentum. Some pages ranked on their own, while others needed optimization. Jack humorously noted that indexing can take time, joking about the urban myth of a man who woke up from a coma to find his site still not indexed.
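Jack doesn't name his exact tooling, so here is a minimal sketch of that sourcing-and-spinning step, with a toy synonym table standing in for the web spinner he actually used. The URL and the synonym map are illustrative, not from the original:

```python
import re
import requests
from bs4 import BeautifulSoup

# Toy synonym table standing in for a real article spinner (illustrative only).
SYNONYMS = {
    "important": "crucial",
    "use": "utilize",
    "easy": "effortless",
    "get": "obtain",
}

def extract_readable_text(url: str) -> str:
    """Fetch a page and keep only its longer paragraph text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    paragraphs = [p.get_text(" ", strip=True) for p in soup.find_all("p")]
    return "\n\n".join(p for p in paragraphs if len(p.split()) > 20)

def naive_spin(text: str) -> str:
    """Swap whole words for synonyms, preserving leading capitalization."""
    def swap(match: re.Match) -> str:
        word = match.group(0)
        replacement = SYNONYMS.get(word.lower())
        if replacement is None:
            return word
        return replacement.capitalize() if word[0].isupper() else replacement

    return re.sub(r"[A-Za-z]+", swap, text)

if __name__ == "__main__":
    # Hypothetical source URL, not one the author used.
    source = extract_readable_text("https://example.com/some-long-article")
    print(naive_spin(source))
```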
Optimize content
Don't mistake this step for rewriting all the content. The goal is to optimize existing pages for more keywords and drive more traffic, and Jack relied on two main strategies for finding those keywords.
Once Jack pinpointed these keywords, he got creative: he either added a few new sections with catchy subheadings or wove the keywords into the existing text. This tweak is key because it plays right into Google's hands: the search giant rewards content packed with relevant keywords. By doing this, Jack ensured his pages were speaking the language Google understands best, making it easier for the algorithm to spot and rank his content.
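The article doesn't spell out the two strategies, but one plausible way to pinpoint such keywords is to mine a Google Search Console performance export for queries that already earn impressions while sitting just off the first page. A minimal sketch, assuming the standard "Queries" CSV export with Query, Clicks, Impressions, CTR, and Position columns; the file name and thresholds are illustrative:

```python
import csv

def near_miss_keywords(csv_path: str,
                       min_impressions: int = 200,
                       min_position: float = 8.0,
                       max_position: float = 30.0) -> list[dict]:
    """Return queries with real demand that rank just below the top results.

    These are candidates to weave into existing sections or to give
    their own subheading.
    """
    candidates = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            impressions = int(row["Impressions"].replace(",", ""))
            position = float(row["Position"])
            if impressions >= min_impressions and min_position <= position <= max_position:
                candidates.append({"query": row["Query"],
                                   "impressions": impressions,
                                   "position": position})
    # Highest-demand opportunities first.
    return sorted(candidates, key=lambda c: c["impressions"], reverse=True)

if __name__ == "__main__":
    # "Queries.csv" is the default name of a GSC performance export.
    for kw in near_miss_keywords("Queries.csv")[:20]:
        print(f'{kw["query"]:40} pos {kw["position"]:5.1f}  {kw["impressions"]} impressions')
```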
Here's the quick rundown:
Steps
Observations
Reverse engineer Google’s algorithm
We’re now at the clickbait part. Jack humorously notes that he learned from YouTube videos to save the juicy details for the end to encourage more engagement. Here’s how he breaks it down:
Understanding user intent
Google aims to provide the most relevant results to users. If it fails, users will abandon the search engine. The analogy given is ordering beer at a bar and getting hot milk instead. Google needs to understand the user's location, the types of results they click on, and when they close their browser. This helps optimize results for specific user groups and keywords.
To rank for any keyword using high-authority sites, you need to serve what the user is looking for. How do you know what the user wants? The answer lies in using Google's own tools to understand search intent.
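One concrete example of "Google's own tools" is the public autocomplete endpoint, which reveals how users actually phrase a query around a seed term. A minimal sketch; note the endpoint is unofficial, and its response shape (a two-element JSON array) is an assumption that may change:

```python
import json
import requests

def google_suggestions(seed: str) -> list[str]:
    """Query Google's public autocomplete endpoint for a seed phrase."""
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=10,
    )
    # Observed response shape: [seed, [suggestion, suggestion, ...]]
    return json.loads(resp.text)[1]

if __name__ == "__main__":
    for phrase in google_suggestions("netflix poster"):
        print(phrase)
```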
Leveraging Google’s tools
Have you ever searched for the meaning or synonym of a word on Google? If so, you’ve likely noticed that Google often displays the information directly on its results page, eliminating the need to visit external sites. This practice helps Google keep users on its platform longer, showing more ads and generating more revenue.
Google understands what users are looking for and tries to provide it directly on the results page. For instance, if someone searches for "Netflix poster," Google might show an image carousel at the top of the results. This indicates that users are primarily looking for images, and Google prioritizes sites with relevant images.
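The article doesn't say how Jack inspected result pages at scale, but one way to check which features Google shows for a keyword is a third-party SERP API. A minimal sketch using SerpAPI's Python client; the feature-to-content hints are illustrative, and the response field names should be verified against SerpAPI's docs:

```python
from serpapi import GoogleSearch  # pip install google-search-results

# Response keys that signal what kind of content Google favors for a query.
# Field names follow SerpAPI's documented result structure (verify before use).
FEATURE_HINTS = {
    "inline_images": "users want images; lead with a relevant image gallery",
    "answer_box": "users want a direct answer; open with a concise definition",
    "related_questions": "users want Q&A; add an FAQ section",
    "inline_videos": "users want video; embed or link one",
}

def serp_features(keyword: str, api_key: str) -> dict:
    """Fetch a results page and report which features Google displays."""
    results = GoogleSearch({"q": keyword, "api_key": api_key}).get_dict()
    return {feature: hint
            for feature, hint in FEATURE_HINTS.items()
            if feature in results}

if __name__ == "__main__":
    for feature, hint in serp_features("netflix poster", "YOUR_API_KEY").items():
        print(f"{feature}: {hint}")
```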
Applying the strategy
Jack looked for snippets in search results and provided similar content on his parasites. If a keyword didn't trigger any snippets, he collected snippets from related keywords instead. Returning to the "Netflix poster" example: since Google prioritizes sites with relevant images for that query, the parasite page should lead with Netflix images.
If a site has a mix of Netflix, Amazon Prime, and HBO images, it may not rank because Google knows users specifically want Netflix posters.
By providing targeted content that meets user intent, the author managed to rank his parasite pages on a high-authority site without backlinks and with average (or even spun) content.
Results
The author’s site saw impressive traffic growth, with average daily impressions between 250k and 300k. While CTR was lower because users in his niche tend to avoid Google Sites pages, the results were significant.
Reproducibility
Jack confirms that the results are reproducible, having used the same technique to rank several other sites. One particular site, despite facing stiff competition, managed to rank well. Here are some additional examples:
Problems
However, there are some challenges:
Conclusion
This story shows that if you've got a site with a good reputation, you can climb the rankings pretty easily just by giving people what they're looking for. Google's way of picking winners may be opaque, but with the right tricks up your sleeve you can really make a splash.