SEO Facts Revealed on Google Webmaster Central
One fact about SEO is that it never stops evolving. With each passing day, another fact is revealed, another important tip is shared, another lesson is learned. Combing through the Webmaster Central Hangouts, we found a number of revelations worth sharing: they represent facts we should all take seriously about SEO and, more importantly, they come straight from Google, the masters.
Further reading:
5 Best Practices to Gain Top Ten Ranking in SERPs
A Complete Guide To Good SEO For Beginners
To stay up to date with the latest happenings in the SEO world, especially where Google is concerned, the Google Webmaster Central Hangout is a source of information you should always keep an eye on. The beauty of it is that it is available to everyone: you can send questions or address them live directly to experts, and get insights straight from Google.
You can check Google’s office hours directly on their website and listen to all the recorded videos, going back to 2012, on all kinds of topics, off-page and on-page SEO facts included. Since you probably don’t want to listen to every recorded hangout, we’ve curated the most interesting takeaways and summarized them below:
- You Can Get a Site Architecture Penalty
- Search Console Doesn’t Count Knowledge Graph Clicks
- Don’t Canonicalize Blog Pages to the Root of the Blog
- Applying Multiple Hreflang Tags to One URL Is Possible
- No Limit on the Number of URLs That Google Bot Can Crawl
- Now You Can Get an ‘Upsetting-Offensive’ Content Flag
- Disavowing Is Still Necessary for a Post-Penguin Real-Time Era
- URL Removal Tool Affects All Domain Variations
- No Specific Limit for Keyword Density in Content
- Content Duplication Penalty Doesn’t Exist
- Better to Add Structured Data Directly on the Page Than Using Data Highlighter
- Have a Relevant URL to Rank Your Images Higher in Google
- Low-Quality Pages Influence the Whole Domain Authority (DA)
- Google Works With Over 150,000 Users and Webmasters Against Webspam
- Google Knows Your Work Place and It Isn’t Afraid to Admit It
- Personal Assistant Search Optimization (PASO) Might Be the Future of SEO
- Google Search Console’s Metrics Get Fully Integrated Into Google Analytics
- Object Recognition Works in Combination With Image Optimization for Better Results in Google
- Google Site Search and Custom Search Within Site Will Become One
- HTTPS Ensures That the Information Users See Is What the Owner of the Site Provides
- URLs Should Have Fewer Than 1,000 Characters
- You Can Have Multiple H1 Elements on a Page
- New Content Should Be Linked High in the Site Architecture
- Google Says It’s Ok to Have Affiliate Links
- Google Will Treat Nofollow Link Attribute as a ‘Hint’
- The Number of Words on a Page Is Not a Ranking Factor
- Google Does Not Render Anything Unless It Returns a 200 Status Code
Further reading:
40 Best Free SEO Tools to Power Your Business
11 Most Important SEO Metrics For Google Ranking
1. You Can Get a Site Architecture Penalty
“Does Panda take site architecture into account when creating a Panda score?” was one of the questions answered on a Google Webmaster Central Hangout.
Google Panda is an algorithm that looks at the overall quality of a website and its content. In a previous blog post, we detailed how this Google update can affect websites and how it relates to topical authority.
From Google’s point of view, Panda is a general quality evaluation that takes everything into account, site architecture included. If your architecture somehow affects the overall quality of the site, you might get a Panda penalty for it.
Panda penalizes “low-quality” or “thin-content” sites and aims to return higher-quality sites near the top of search results. If, for example, your website has issues that affect its overall quality, such as a bad structure, you should take some time to improve that.
If you are redesigning your website, building a new site from scratch or doing anything else that might affect its architecture, you should take into consideration what John Mueller said.
Further reading: Useful URL Structuring Tips That Will Help Higher Ranking
2. Search Console Doesn’t Count Knowledge Graph Clicks
Another fact about SEO revealed on Google Webmaster Central Hangout is that the Search Console doesn’t count Knowledge Graph sidebar clicks or impressions.
John Mueller had this to say about that:
If you search for your company’s name and it appears in the Knowledge Graph sidebar with a link to your website, that click isn’t counted. Sitelinks, on the other hand, should be counted.
3. Don’t Canonicalize Blog Pages to the Root of the Blog
Another fact about SEO revealed in a Google Webmaster Hangout was about blog pages canonicalization to the root of the blog.
Somebody asked whether it’s correct to set up blog subpages with a canonical URL pointing to the blog’s main page as the preferred version, given that the subpages aren’t a true copy of the main page.
It isn’t: pointing subpages’ canonical tags at the blog’s main page is not a correct setup, because those pages are not equivalent from Google’s point of view.
Even if Google sees the canonical tag, it will ignore it, treating it as a webmaster’s mistake.
This method has been overused by lots of websites, but that doesn’t make it a good practice; companies small and big make mistakes all the time.
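To make the distinction concrete, here is a minimal HTML sketch; the example.com URLs are placeholders:

```html
<!-- On https://example.com/blog/page/2/ -->

<!-- Incorrect: page 2 is not equivalent to the blog root,
     so Google will most likely ignore this tag -->
<link rel="canonical" href="https://example.com/blog/">

<!-- Correct: a paginated subpage canonicalizes to itself -->
<link rel="canonical" href="https://example.com/blog/page/2/">
```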
4. Applying Multiple Hreflang Tags to One URL Is Possible
Another interesting SEO fact was unveiled when someone asked Google the following question:
Can you assign both x-default and en hreflang to the same page?
John Mueller gave a short and comprehensive explanation to this:
Yes, you can apply x-default, and en hreflang to the same page.
It is also possible to tell Google that a page is English for the UK and, at the same time, the default page; the x-default hreflang attribute doesn’t have to point to a different page. You can likewise set multiple hreflang tags pointing to the same page: for example, one each for UK English, Australian English and American English, with UK English as the default for the website.
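As a rough sketch of what Mueller describes, the annotations below (with placeholder URLs) mark the UK English page as both the en-GB version and the x-default:

```html
<!-- In the <head> of each language version of the page -->
<link rel="alternate" hreflang="en-GB" href="https://example.com/uk/">
<link rel="alternate" hreflang="en-AU" href="https://example.com/au/">
<link rel="alternate" hreflang="en-US" href="https://example.com/us/">
<!-- x-default may point to the same URL as one of the languages -->
<link rel="alternate" hreflang="x-default" href="https://example.com/uk/">
```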
Further reading: 10 Costly Search Engine Mistakes to be Avoided
5. No Limit on the Number of URLs That Google Bot Can Crawl
The idea that Google can crawl only 100 URLs is a myth.
That is one of the most interesting pieces of information among the facts about SEO exposed on Google Webmaster Central Hangout. Somebody asked if there is a limit to the number of URLs that Googlebot can crawl. John Mueller answered that there is no limit.
Many of us believed Google would only crawl a hundred pages per website; there is no such limit. He also explained that Google works with a crawl budget: it tries to determine what, and how much, it can and wants to crawl on each website.
The crawl budget topic was explained on the Google Webmaster Central Blog, along with the factors that impact it (a robots.txt sketch addressing the first factor follows the list):
- Faceted navigation and session identifiers
- On-site duplicate content
- Soft error pages
- Hacked pages
- Infinite spaces and proxies
- Low-quality and spam content.
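As a hypothetical illustration of the first factor, a few robots.txt rules like these can keep Googlebot away from faceted-navigation and session URLs; the parameter names are made up for the example:

```
# Hypothetical robots.txt rules: keep Googlebot away from
# faceted-navigation and session URLs that waste crawl budget
User-agent: *
Disallow: /*?sessionid=
Disallow: /*?sort=
Disallow: /*?color=
```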
Further reading:
Google’s New Content Guidelines and AI Generated Content
Deliberate Link Building Can be Offensive to Google
6. Now You Can Get an ‘Upsetting-Offensive’ Content Flag
Google’s effort to stop fake news and offer quality results never ends; improving search value is an ongoing process.
Paul Haahr, ranking engineer at Google, even said: “they’re explicitly avoiding the term ‘fake news,’ because it is too vague”.
Google is continuously trying to offer quality results, and for that they have worked with over 100,000 people to evaluate search results.
The people involved are known as Google search quality raters: they run real searches on Google and rate the quality of the pages that appear in the SERP, without altering the results directly. These raters follow a set of guidelines that includes an entire section on the “Upsetting-Offensive” content flag.
Content can be flagged under the following circumstances:
- Has content that promotes hate or violence against a group of people based on criteria including (but not limited to) race or ethnicity, religion, gender, nationality or citizenship, disability, age, sexual orientation, or veteran status.
- Contains content with racial slurs or extremely offensive terminology.
- Includes graphic violence.
- Includes explicit how-to information about harmful activities (e.g., how-tos on human trafficking or violent assault).
- Contains other types of content that fall under the upsetting or offensive category.
7. Disavowing Is Still Necessary for a Post-Penguin Real-Time Era
Another intriguing fact Google shared on the Webmaster Central Hangout is the confirmation of the following:
You should still use the disavow file even though Google Penguin is real time now.
John Mueller explains that if you know your website used shady, unnatural ways to generate backlinks that don’t respect Google’s guidelines, you need to disavow the links that could harm your website. Evaluate your backlink profile to see where you stand.
The intriguing question somebody asked him was: “Now that Penguin runs real time, would it be correct to think that if we found a few bad links on our site and disavowed them, we might see a ranking improvement relatively quickly? Also, does Penguin now just devalue those low-quality links, as opposed to punishing you for them?”
John Mueller’s answer was that Google looks at webspam across the whole website before deciding what action to take. If you have damaging links and use the disavow file, Google will understand that you don’t want those links associated with your site.
Google Penguin is a webspam algorithm and it doesn’t focus only on links. It decreases search rankings for those sites that violate Google’s Webmaster guidelines.
Ultimately, disavowing unnatural links helps Google Penguin and is a recommended practice, one of the facts about SEO backed up by both John Mueller and Gary Illyes.
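For reference, a disavow file is a plain-text list you upload through Google’s disavow tool; a minimal sketch with placeholder domains looks like this:

```
# Lines starting with # are comments
# Disavow a single spammy URL
http://spammy-directory.example/links.html
# Disavow every link from an entire domain
domain:link-farm.example
```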
Further reading:
SEO Myths That Can Kill Your Ranking Gains
4 Ways to Build Quality Backlinks to Your Website
8. URL Removal Tool Affects All Domain Variations
Take all domain variations into consideration when you use the URL removal tool, including www/non-www and HTTP/HTTPS: the tool affects all of them, so it is not a fix for canonicalization.
If you try to remove the HTTP version after a site move, both the HTTP and HTTPS versions will be removed, which is something you definitely don’t want.
Use URL removal only for urgent issues, not for site maintenance.
For example, if you simply want to remove a page from your website, you don’t need the URL removal tool: the page will drop out automatically over time as Google recrawls and reindexes the site.
Further reading:
Key SEO Mistakes That Will Kill Your Ranking Gains
9. No Specific Limit for Keyword Density in Content
A compelling question was asked recently at a Google Webmaster Hangout about the limit for keyword density in content. John Mueller gave a straightforward answer:
We expect content to be written naturally, so focusing on keyword density is not a good use of your time. Focusing too much on keyword density makes it look like your content is unnatural.
Also, if you stuff your content with keywords, it makes it harder for users to read and understand what your content is about. On top of that, search engines will recognize that instantly and ignore all the keywords on that page.
Instead of concentrating on the limit of keyword density in content, you should aim your attention at making your content easy to read.
Try this trick: call somebody on the phone and read your content out loud. If they understand what you said within the first two minutes, you’re safe; your content can pass as natural.
10. Content Duplication Penalty Doesn’t Exist
Content duplication has been a long-debated topic.
Let’s bust an SEO myth: that duplicate content will bring you a Google penalty. It’s not true; such a penalty doesn’t exist.
Back in 2013, Matt Cutts, the former head of Google’s Webspam team, said:
In the worst-case scenario (with spam free content), Google may just ignore duplicate content. I wouldn’t stress about this unless the content that you have duplicated is spammy or keyword stuffing. – Matt Cutts
Google tries to offer relevant results for every search query. That is why, in most cases, the results that appear in the SERP are filtered so the same information isn’t shown more than once; it is not helpful for users to see two copies of the same content on the first page.
In 2014, during a Google Webmaster Central hangout, John Mueller said that Google doesn’t have a duplicate content penalty.
We won’t demote a site for having a lot of duplicate content – John Mueller
This was confirmed in June 2016, when Andrey Lipattsev, senior Google search quality strategist, repeated that a content duplication penalty doesn’t exist.
You could, however, get a penalty if you violate Google’s guidelines by creating content in an automated way, scraping it from multiple locations, or serving no purpose other than generating traffic.
11. Better to Add Structured Data Directly on the Page Than Using Data Highlighter
The submitted question on this topic was the following:
Is it better to use the Data Highlighter from Search Console, or to add markup directly to the web page? What’s the best practice?
John Mueller says structured data is a great option to try out. Some sites have content that needs structured data, and the Data Highlighter is an easy way to test it.
Implementing structured data directly on the page, rather than through the Data Highlighter, requires more attention, but it is the recommended procedure from a long-term perspective. The markup is visible to everyone, you can verify that it’s implemented correctly and, more importantly, you don’t have to worry about it afterwards.
If you mark up each element of your content, your data will still be read properly when you change your layout, for example; the Data Highlighter, by contrast, has to relearn the page after such a change.
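For instance, adding markup directly to the page often means a JSON-LD block like the sketch below; all the values are placeholders, and Google’s structured data testing tools can then verify the markup:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO Facts Revealed on Google Webmaster Central",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2017-03-15",
  "image": "https://example.com/images/seo-facts.jpg"
}
</script>
```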
Further reading:
10 Analytical Tools For Link Building
27 On-Page SEO Checklist for Top Search Engine Performance
12. Have a Relevant URL to Rank Your Images Higher in Google
Another important fact we picked up from the Google Webmaster Central Hangout concerns the impact of an image’s URL. Can it help Google better understand what an image is about?
And here is what Google had to say about this:
We do look at the URL for images specifically and we try to use that for ranking, but if the URL is irrelevant for that query and nobody is searching for that term, it doesn’t help Google.
The main point is that Google looks at the URL when ranking images, because it aims to deliver quality results. The correlation between the domain name and the query is highly significant, and making image URLs relevant is an effective SEO tactic you can apply.
SEO-friendly URLs can ultimately lead to higher perceived relevance, which gives you an edge from an SEO perspective. We know this for a fact from a study we conducted on 34k keywords.
Let’s say an image’s file name is 123 but the image is about flowers. The image won’t disappear from the index because of its irrelevant file name, but there may be a problem with how, and for what query, it will show up.
Even though Google is becoming smarter and smarter at object recognition, we still need to help it by offering descriptive information about the image (follow the guidelines from point 18 on this matter).
SEO-friendly URLs are a must when we talk about content, images or any other type of information on your website. It is important to pay extra attention when naming your files, an SEO fact also backed up by Matt Cutts.
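Returning to the flowers example, the difference is as simple as this sketch (the paths are placeholders):

```html
<!-- Irrelevant file name: "123" says nothing about the image -->
<img src="/images/123.jpg" alt="flowers">

<!-- Descriptive, hyphenated file name that matches likely queries -->
<img src="/images/red-tulip-bouquet.jpg" alt="Red tulip bouquet in a glass vase">
```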
13. Low-Quality Pages Influence the Whole Domain Authority (DA)
Another interesting fact about SEO that Google debated on Google Webmaster Central Hangouts was whether low-quality pages influence Domain Authority.
The exact question addressed to Google’s representatives was “Can low-quality pages on the site affect the overall authority?”
In general, when the Google algorithm looks at a website, it also takes into consideration individual pages.
You can find more on this subject in the Google Webmaster Hangout from March 2017, at minute 10:05.
Let’s say you have a small website: if it has bad-quality pages, they could affect the domain’s authority. On a large site with only a sprinkle of bad pages, by contrast, the damage is not so big; Google understands that, in the bigger picture, those pages aren’t the main issue.
The main pain point is that low-quality pages affect the way Google views the website overall, and this is something you should fix either way.
Find a solution for those pages: either remove them or improve their quality.
Also read: Top Google Ranking Factors Revealed in a Backlinko Study
14. Google Works With Over 150,000 Users and Webmasters Against Webspam
Among the most interesting facts about SEO that Google shared on the Webmaster Hangout is the way it fights spammers in search. Webspam remains a persistent problem, and the data in the most recent webspam report, published afterward on the blog, set off alarm bells.
In 2016, the number of hacked sites increased by 32% compared to 2015.
Now, with the help of users, spam in Google’s search results is being tackled. Last year, Google started working with users, in addition to webmasters, to improve the quality of the SERP, clean it of spammy websites and make the web a safer environment.
The Webmaster Central staff worked with over 150,000 website owners, webmasters and digital marketers against webspam, and users helped Googlers tackle webspam as well.
Approximately 180,000 spam reports were submitted by users around the world, as we spotted on Google Webmaster Central Blog.
Besides making Google Penguin real-time and improving the algorithm to act against webspam, Google’s spam team manually reviewed sites’ structured data markup and took manual action on more than 10,000 sites that did not meet the quality guidelines and used black-hat SEO techniques.
15. Google Knows Your Work Place and It Isn’t Afraid to Admit It
Nowadays, we take our smartphones everywhere; they’re practically attached to our hands. We are logged into email, we have easy access to our accounts and contacts, and we can find anything we want with a quick Google search, whether we’re walking or riding.
Google Assistant makes it easier to search and find what we want by offering us the possibility to make voice searches and personalize our content on Android, even when we’re offline.
With the voice search feature, you can take lots of actions on your phone and in Google Search:
- Find contacts, call, email, or send messages on social apps (Facebook, Twitter, etc.).
- Perform actions on your phone (set an alarm, turn on NFC or Bluetooth, adjust brightness, take a picture).
- Organize your calendar and see future events or schedule a meeting.
- Open apps you have on your smartphone.
- Listen to songs.
On a more complex level, you can ask Google Assistant questions and give it commands such as “What do I have to do today?” or “Book a table at East Restaurant.”
You can see the personalized touch if you search in Apps for a person, task, or event: you’ll find results for that specific query from all your installed apps.
16. Personal Assistant Search Optimization (PASO) Might Be the Future of SEO
PASO stands for Personal Assistant Search Optimization. It might be the future of SEO because mobile devices are under continuous development, now with the Google Assistant and AMP for mobile optimization.
Google is trying to simplify the search process and to make it as easy as possible to get the information you want with fewer taps on your mobile device.
An important reason PASO might be the future is that it offers personalized information for each user, something to keep in mind when working on your search engine rankings.
Google Assistant is the quickest and easiest way to get your hands on information, whether you ask for a recipe, an explanation or definition, or directions.
You ask, and Google Assistant gives you the information in a heartbeat. A personal assistant result is likely to have a higher CTR, because it is tailored to your question and it’s the only result you see, compared with a page of search engine results. Beyond that, it offers suggestions related to your previous searches.
If you want to get featured on Google Assistant, make sure your content answers a question and is optimized for humans.
Follow the same steps as when optimizing your content for position zero. One way is to ask the question and then answer it directly, which gives you a shot at being displayed in the answer box, as in the sketch below.
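A minimal sketch of that question-then-answer pattern (the question and answer here are invented for illustration):

```html
<!-- Ask the question in a heading, answer it concisely right after -->
<h2>What is crawl budget?</h2>
<p>Crawl budget is the number of pages Googlebot can and wants to crawl
   on your site. Clean URLs and high-quality content help Google spend
   it on the pages that matter.</p>
```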
17. Google Search Console’s Metrics Get Fully Integrated Into Google Analytics
The Google Search Console helps webmasters manage the way their websites appear in the SERP, while Google Analytics lets users integrate services like Webmaster tools, AdWords and others to see results broken down by source (paid search, organic search).
Over the years, users could link Google Webmaster Tools with Analytics, but the results appeared separately, in different sections; the Search Console results showed up in Google Analytics under the Acquisition section, an option that is still available.
In the old version, you could only see how users arrived at the website, not what they did once they got there. With this improvement, webmasters can see all the metrics in one place and make decisions based on the combined data from the two tools.
You have access to Acquisition, Behavior and Conversions data in one place. This update opens up many more ways to use the search data, as mentioned on the Google Webmaster Central Blog:
- discover the most engaging landing pages that bring visitors through organic search;
- discover the landing pages with high engagement but few visitors from organic search;
- detect the best ranking queries for each landing page;
- segment organic performance for each device in the new Device report.
18. Object Recognition Works in Combination with Image Optimization for Better Results in Google
The question that popped up in another Google Webmaster Central Hangout was:
Google is getting better and better at object recognition. Does this mean that we no longer need to optimize our images with descriptive file names, alt tags, title tags, etc.?
Optimized images remain a good signal for improved search engine rankings.
At this moment, we still need to optimize our images in order to rank as high as possible.
Maybe we won’t have to do that in the future, but for the time being a relevant file name, alt description, title tag and caption all help Google understand the image. The text around the image also helps Google understand what the page is about and what it should rank for.
Maybe 5-10 years from now, if things go very well, all that information won’t be necessary; for now, image optimization works hand in hand with object recognition to deliver better results, as in the sketch below.
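Putting those signals together, a fully optimized image might look like this sketch (file names and text are placeholders):

```html
<figure>
  <!-- File name, alt text and title all reinforce the same topic -->
  <img src="/images/chocolate-chip-cookies.jpg"
       alt="Freshly baked chocolate chip cookies on a cooling rack"
       title="Chocolate chip cookies">
  <figcaption>Chocolate chip cookies, fresh out of the oven.</figcaption>
</figure>
<!-- The surrounding text gives Google further context about the page -->
<p>Our chocolate chip cookie recipe takes about twenty minutes...</p>
```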
19. Google Site Search and Custom Search Within Site Will Become One
Do you know if Google Site Search is going to shut down and transfer to Custom Search? What’s the impact?
This is about the site-specific search you can set up to search within your own website.
Custom Search is quite similar and, as the heading suggests, the two services will effectively become one.
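For context, a Custom Search box is embedded with a snippet like this; the cx engine ID is a placeholder you get from the Custom Search control panel:

```html
<script async src="https://cse.google.com/cse.js?cx=YOUR_ENGINE_ID"></script>
<div class="gcse-search"></div>
```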
20. HTTPS Ensures That the Information Users See Is What the Owner of the Site Provides
Another interesting SEO fact debated on the Google Webmaster Central Hangout was how relevant HTTPS is as a ranking factor for information-only sites.
The official response from Google was:
HTTPS is relevant for any type of website. It doesn’t matter if the site is informational only; it is still relevant. HTTPS is not only for encrypting information like credit cards and passwords; it also ensures that the information users see is what the owner of the website is providing.
A common attack takes place in hotels, airports, cafes and other public spaces that offer free Wi-Fi: attackers can intercept plain HTTP pages and inject tracking code or ads into them, because the connection isn’t secured. Nobody wants spammy ads on their website, even if it’s informational only.
HTTPS helps users see the content that was meant to be shown.
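One practical consequence: once a page is served over HTTPS, its resources must be too, or browsers may block them as mixed content. A small sketch with placeholder URLs:

```html
<!-- On an https:// page, this http:// script is "mixed content";
     modern browsers may block it or strip the padlock -->
<script src="http://example.com/assets/app.js"></script>

<!-- Load every resource over https as well -->
<script src="https://example.com/assets/app.js"></script>
```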