1. Google Introduces A Policy Circumvention Spam Policy – Google has added a new policy to its search spam policies: “Policy circumvention.” In short, if you take any action to bypass Google Search's other spam or content policies, such as creating new sites or using other sites or methods to keep distributing that content, whether on third-party sites or through other avenues, Google may restrict that content or remove it from search. Here is what Google wrote:
“If you engage in actions intended to bypass our spam or content policies for Google Search, undermine restrictions placed on content, a site, or an account, or otherwise continue to distribute content that has been removed or made ineligible from surfacing, we may take appropriate action which could include restricting or removing eligibility for some of our search features (for example, Top Stories, Discover). Circumvention includes but is not limited to creating or using multiple sites or other methods intended to distribute content or engage in a behavior that was previously prohibited.”
2. Google’s Advice On When You Should Move Your Blog To A Subdomain – John Mueller of Google recently shared his advice on when you should move a blog to a subdomain. John said he would move a blog to a subdomain, rather than keep it under the main www domain, when he thinks the content can stand on its own.
He said “my way of thinking with regards to subdomains is that it depends on what you’re trying to do. Is it content that’s meant to be tightly connected to the main site? Then put it on the main site. If you want the content to stand on its own, then a subdomain is a good match.”
He also shared that there are technical considerations to think about outside of SEO. He said “There’s also the technical side-effect of subdomains sometimes making things a bit more complicated: verification in search console, tracking in analytics, DNS, hosting, security, CSPs, etc.”
Lastly, John added “To be clear, I think it will affect rankings of the new content, but ultimately it depends on what you want to achieve with it. Sometimes you want something separated out, sometimes you want to see something as a part of the main site. These are different situations, and the results will differ.”
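John's note about DNS is easy to make concrete. Below is a minimal sketch using the hypothetical hostnames www.example.com and blog.example.com: a blog moved to a subdomain only resolves if that subdomain has its own DNS record, which is separate from whatever the main site already has configured.

```python
import socket

# Hypothetical hostnames for illustration; a subdomain only resolves if it
# has its own A/AAAA or CNAME record, independent of the main site's DNS.
for host in ("www.example.com", "blog.example.com"):
    try:
        print(f"{host} -> {socket.gethostbyname(host)}")
    except socket.gaierror:
        print(f"{host} -> no DNS record configured yet")
```

The same separateness applies to the other items John lists: Search Console verification, analytics tracking, and CSP headers each need to account for the new hostname.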
3. Google: 60% Of The Internet Is Duplicate & Google Prefers HTTPS – Gary Illyes from Google shared during Google Search Central Live in Singapore that 60% of the content on the internet is duplicate. To find duplicates, Google computes a checksum from the main content of each page; if two pages' checksums match, Google treats the content as duplicate. Lastly, Gary mentioned that Google will always pick an HTTPS URL over its HTTP counterpart.
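Google has not said which checksum algorithm it uses or exactly how the main content is extracted, so the following is only a minimal sketch of the idea, assuming SHA-256 over whitespace-and-case normalized text (both are assumptions):

```python
import hashlib
import re

def content_checksum(main_content: str) -> str:
    # Assumption: collapse whitespace and lowercase so trivial formatting
    # differences don't produce different checksums.
    normalized = re.sub(r"\s+", " ", main_content).strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

page_a = "Widgets 101: a beginner's   guide to widgets."
page_b = "Widgets 101: A beginner's guide to widgets."

# Matching checksums -> the pages would be treated as duplicates.
print(content_checksum(page_a) == content_checksum(page_b))  # True
```

A plain checksum like this only catches exact matches after normalization; per the deduplication entry in item 5's list, Google's systems also target near-duplicates, which requires fuzzier matching.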
Ensure that your website is served over HTTPS, and focus on producing content that is more original and useful than most of what is already out on the internet.
4. Google Re-confirms That E-A-T Applies To Every Single Search Query – During the recent SMX Next event, Hyung-Jin Kim, the Vice President of Google Search (who has worked on search quality for the past 20 years and leads core ranking at Google Search), reconfirmed that E-A-T is used in every single query and applied to everything Google Search does. “E-A-T is a core part of our metrics,” he added, explaining that it is there to “ensure the content that people consume is going to be, is not going to be harmful and it is going to be useful to the user.” Here is the transcript of what he said exactly:
“E-A-T is a core part of our metrics and it stands for expertise, authoritativeness and trustworthiness. This has not always been there in Google, and it is something we have developed about 10 to 12 to 13 years ago. And it is really there to make sure that, along the lines of what we talked about earlier, that is it really there to ensure the content that people consume is going to be, is not going to be harmful and it is going to be useful to the user. These are principles we live by every single day.
And E-A-T, that template, of how we rate an individual site based on expertise, authoritativeness and trustworthiness, we do it to every single query and every single result. So it is actually pretty pervasive throughout everything we do.
I will say that YMYL queries, the your money or your life queries, such as when I am looking for a mortgage or when I am looking for the local ER, those we have a particular eye on and pay a bit more attention to those queries because those are some of the most important decisions people can make, some of the most important decisions people will make in their lives. So I will say that E-A-T is has a bit more of an impact there but again, I will say that E-A-T applies to everything, every single query that we have.”
5. Google Publishes A Guide To Current & Retired Ranking Systems – A new guide to Google’s ranking systems lets you see which systems Google uses to rank search results and which ones are no longer in use. The guide also introduces new terminology, distinguishing between ranking “systems” and ranking “updates”: a system, such as RankBrain, runs constantly in the background, while an update is a one-time change made to those systems.
For instance, the helpful content system is always active in the background when Google returns search results, but it receives occasional updates to improve its performance. Spam updates and core updates are other examples of one-time adjustments to ranking systems.
Here is the list, in alphabetical order, of Google’s ranking systems that are currently operational.
- BERT: Short for Bidirectional Encoder Representations from Transformers, BERT allows Google to understand how combinations of words can express different meanings and intent.
- Crisis information systems: Google has systems in place to provide specific sets of information during times of crisis, such as SOS alerts when searching for natural disasters.
- Deduplication systems: Google’s search systems aim to avoid serving duplicate or near-duplicate webpages.
- Exact match domain system: A system that ensures Google doesn’t give too much credit to websites with domain names that exactly match a query.
- Freshness systems: Systems designed to show fresher content for queries where it would be expected.
- Helpful content system: A system designed to better ensure people see original, helpful content, rather than content made primarily to gain search engine traffic.
- Link analysis systems and PageRank: Systems that determine what pages are about and which might be most helpful in response to a query based on how pages link to each other.
- Local news systems: A system that surfaces local news sources when relevant to the query.
- MUM: Short for Multitask Unified Model, MUM is an AI system capable of understanding and generating language. It improves featured snippet callouts and is not used for general ranking.
- Neural matching: A system that helps Google understand representations of concepts in queries and pages and match them to one another.
- Original content systems: Systems that help ensure Google shows original content prominently in search results, including original reporting, ahead of those who merely cite it.
- Page experience system: A system that assesses various criteria to determine if a webpage provides a good user experience.
- Passage ranking system: An AI system Google uses to identify individual sections or “passages” of a web page to understand better how relevant a page is to a search.
- Product reviews system: A system that rewards high-quality product reviews written by expert authors with insightful analysis and original research.
- RankBrain: An AI system that helps Google understand how words are related to concepts. Allows Google to return results that don’t contain exact words used in a query.
- Reliable information systems: Google has multiple systems to show reliable information, such as elevating authoritative pages, demoting low-quality content, and rewarding quality journalism.
- Removal-based demotion systems: Systems that demote websites subject to a high volume of content removal requests.
- Site diversity system: A system that prevents Google from showing more than two webpage listings from the same site in the top results (see the sketch after this list).
- Spam detection systems: Systems that deal with content and behaviors that violate Google’s spam policies.
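To make the site diversity entry concrete, here is a hypothetical re-ranking sketch (not Google's actual implementation) that caps a ranked result list at two listings per host:

```python
from collections import Counter
from urllib.parse import urlparse

def diversify(ranked_urls: list[str], per_site_limit: int = 2) -> list[str]:
    # Hypothetical filter: keep results in rank order, but allow at most
    # `per_site_limit` results from any one host in the top results.
    seen = Counter()
    diversified = []
    for url in ranked_urls:
        host = urlparse(url).hostname
        if seen[host] < per_site_limit:
            seen[host] += 1
            diversified.append(url)
    return diversified

results = [
    "https://example.com/a",
    "https://example.com/b",
    "https://example.com/c",   # third listing from example.com is dropped
    "https://other.example/x",
]
print(diversify(results))
```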