Page Experience Signals to Roll Out in May 2021
Core Web Vitals are three metrics that Google uses to help determine whether a webpage is delivering a good user experience: Largest Contentful Paint (loading performance), First Input Delay (interactivity), and Cumulative Layout Shift (visual stability). These signals are combined with several other factors (mobile-friendliness, safe browsing, HTTPS, and no intrusive interstitials) to make up the page experience signals.
Google announced in May that these signals would have an impact on a site’s ranking in Google search in the future. And now we know when: May 2021. Google is also testing a “visual indicator” to highlight pages with what the search engine considers to be a “great page experience.”
If you’re not already tracking Core Web Vitals and the other page experience signals as part of your site’s performance work, it’s time to start!
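If you want to see where your pages stand with real visitors, Google publishes an open-source JavaScript library, web-vitals, that reports all three metrics from the browser. Here’s a minimal sketch using its v1-style API; the reportToAnalytics function and the /analytics endpoint are hypothetical placeholders for your own logging:

```typescript
// Field measurement of the three Core Web Vitals with Google's
// open-source web-vitals library (npm install web-vitals).
import { getCLS, getFID, getLCP, Metric } from 'web-vitals';

// Hypothetical reporter -- swap in your own analytics collector.
function reportToAnalytics(metric: Metric): void {
  navigator.sendBeacon('/analytics', JSON.stringify({
    name: metric.name,   // 'CLS', 'FID', or 'LCP'
    value: metric.value, // CLS is unitless; FID and LCP are in milliseconds
  }));
}

getCLS(reportToAnalytics); // Cumulative Layout Shift (visual stability)
getFID(reportToAnalytics); // First Input Delay (interactivity)
getLCP(reportToAnalytics); // Largest Contentful Paint (loading)
```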
Read more here and learn more about what factors are part of page experience signals here.
It’s Not “Mobile-First” – It’s Really “Mobile-Only”
As part of the ongoing conversation about Google’s shift to mobile-first indexing, there has been some confusion about the role that desktop will play in indexing moving forward.
Google’s John Mueller spoke at the marketing conference PubCon in mid-October and reminded attendees that Google will only index the mobile version of your site (if you have one) – anything on the desktop that’s not on the mobile version will not be indexed.
Google will not look at the desktop version at all in most cases (and when it does, it’s checking for spam or other manipulations).
It should be noted that if your site is desktop-only, Google will still crawl it with the mobile crawler.
Read more here.
John also spoke about the future of SEO during the conference, and he talked about the importance of user experience (UX) testing – particularly testing real sites with real content. With the emphasis on page experience signals, as mentioned above, it’s important to test using multiple tools and make sure your results are repeatable. Learn more here.
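One way to make that testing repeatable is to script it rather than running audits by hand. Here’s a rough sketch using the open-source Lighthouse Node module; the URL is a placeholder, and this would be just one of several tools in the mix:

```typescript
// Run a headless Lighthouse performance audit so results can be
// re-run and compared over time (single runs can vary, so repeat them).
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function audit(url: string): Promise<void> {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    output: 'json',
    onlyCategories: ['performance'],
  });
  // Lighthouse scores categories on a 0-to-1 scale.
  console.log(url, 'performance score:',
    result?.lhr.categories.performance.score);
  await chrome.kill();
}

audit('https://www.example.com/'); // placeholder URL
```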
Slow Pages Can Hurt Faster URLs
We know that site speed is already a ranking factor, and metrics like First Contentful Paint (FCP) will become even more important with the focus on page experience signals mentioned above.
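For reference, FCP is exposed by the browser’s standard Performance API, so you can observe it directly in the field; a small sketch:

```typescript
// Log First Contentful Paint using the standard PerformanceObserver API.
// 'paint' entries include first-paint and first-contentful-paint.
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      console.log(`FCP: ${entry.startTime.toFixed(0)} ms`);
    }
  }
}).observe({ type: 'paint', buffered: true });
```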
In a recent Google Office Hours video, John Mueller was asked if slow pages could have an impact on the site as a whole. John said that Google’s algorithms try to be granular, but they may not have data for every URL on your site. If Google can’t recognize different parts of your site and group similar pages together (John used a site’s forum section as an example), then it will likely take an aggregate score across the full site.
Core Web Vitals are not yet used in search rankings, but if you have groups of pages that are likely to be slower, organizing your site in a way that makes it easy for Google to separate them from the rest could help.
Read more here.
News from Google’s “Search On 2020” Event
In mid-October, Google held a livestream event called “Search On” to talk about the ways the search engine is using artificial intelligence. You can read all of the articles about the event here, but there are a few announcements that drew the attention of those in the SEO community:
- BERT (a natural language processing model) is now used on almost all English queries
- Google is getting better at understanding subtopics
- The search engine is better able to understand more “key moments” in videos
- Google can now index “passages” or just portions of a page
This announcement on passage indexing has garnered some of the biggest interest. The idea here is that if you’re searching for one small piece of information, Google can find it for you – even if it’s just one small part of a much bigger article. The goal is to provide more relevant search results, but it could lead to a big shake-up for some queries, with pages that have never ranked before potentially appearing for keywords they were not specifically targeting.
Google has since clarified that this is a ranking change, not an indexing change. Google’s John Mueller has also said that you don’t need to optimize for this change, but being clear with your subheadings and structuring your page in a way that’s easy for Google (and visitors) to understand may be more important than ever.
Here are several articles with more details on “Search On” and passage ranking:
- https://www.seroundtable.com/google-search-announcements-30279.html
- https://searchengineland.com/how-google-indexes-passages-of-a-page-and-what-it-means-for-seos-342215
- https://www.seroundtable.com/google-on-passage-indexing-tips-30360.html
How Search Works
If you have some time (about an hour), Google has made a short film called “Trillions of Questions, No Easy Answers: A (home) movie about how Google Search works.” It provides a lot of information on Google’s changes and challenges over the years, including interviews with Google employees.
How Google Handles Duplicate Content
We all know (or we should!) that duplicate content should be avoided wherever possible. But how does Google know that pages contain duplicate content? In a recent podcast, Google’s Gary Illyes gave a detailed explanation of how the search engine detects duplication, builds clusters of those pages, and then chooses a “leader page” for each cluster.
To detect duplication, the algorithm doesn’t compare all the words on the page. It first tries to eliminate any boilerplate content (navigation, footer, etc.), then reduces what remains to a checksum: a small piece of data that serves as a fingerprint for the page. The checksums are then compared to find both exact and near duplicates.
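Google hasn’t published the exact algorithm, but the general shape is easy to sketch: strip what looks like boilerplate, then hash what remains. The toy version below only catches exact matches after normalization; real near-duplicate detection would need a fuzzier fingerprint (shingling or SimHash, for example):

```typescript
import { createHash } from 'crypto';

// Toy illustration of checksum-based duplicate detection -- not Google's
// actual (unpublished) algorithm. Pages whose main content hashes to the
// same value would land in the same duplicate cluster.
function contentChecksum(html: string): string {
  const mainContent = html
    // Crude boilerplate removal: drop nav/header/footer blocks.
    .replace(/<(nav|header|footer)[\s\S]*?<\/\1>/gi, '')
    // Drop remaining tags, collapse whitespace, normalize case.
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim()
    .toLowerCase();
  return createHash('sha1').update(mainContent).digest('hex');
}

const a = contentChecksum('<nav>Menu</nav><p>Same article text.</p>');
const b = contentChecksum('<nav>Other menu</nav><p>Same  article text.</p>');
console.log(a === b); // true -- boilerplate differences are ignored
```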
Once those duplicates are found and clustered, “over 20 signals” are used to determine which page is picked as the canonical. Gary didn’t outline exactly which signals are used, but he did mention content, PageRank, HTTPS, whether the page is included in a sitemap, redirects, and rel=canonical annotations. The page that is chosen as the canonical is what will appear in the search results.
Read more from the discussion and find a link to the podcast here.
Every Serious Website Should Have an XML Sitemap
Having an automatic XML sitemap file is “a minimal baseline for any serious website,” according to John Mueller.
In response to a tweet about the request indexing tool – which is (as of this writing) still not working – one user mentioned the difficulty they were having getting Google to re-crawl updated pages without the tool, since their sitemap did not update automatically. John suggested that the better solution was to fix the sitemap so that it updates automatically rather than depending on manual submissions.
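In practice, “automatic” means regenerating the sitemap from your content source on every build or publish, so the <lastmod> dates are never stale. A minimal sketch, with a hypothetical page list standing in for your CMS or database:

```typescript
import { writeFileSync } from 'fs';

interface Page {
  loc: string;     // absolute URL (XML-escape these in real use)
  lastmod: string; // W3C date, e.g. '2020-10-28'
}

// Build sitemap.xml from whatever the content store reports, so the
// file never has to be maintained (or resubmitted) by hand.
function buildSitemap(pages: Page[]): string {
  const urls = pages
    .map((p) => `  <url><loc>${p.loc}</loc><lastmod>${p.lastmod}</lastmod></url>`)
    .join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    `${urls}\n</urlset>\n`;
}

// Hypothetical page data -- in practice, query your CMS or database here.
const pages: Page[] = [
  { loc: 'https://www.example.com/', lastmod: '2020-10-26' },
  { loc: 'https://www.example.com/blog/seo-news', lastmod: '2020-10-28' },
];
writeFileSync('sitemap.xml', buildSitemap(pages));
```

Most CMS platforms and site generators have plugins that handle this for you; the point is simply that no one should have to remember to update the file by hand.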
See the discussion here.
When Translating a Site, Make Sure All Relevant Words Are Translated
A reminder from Google’s Danny Sullivan: if you’re translating a site into a different language, don’t forget to translate the page title and other relevant elements too. Otherwise, you’ll see the original-language title in the search results.
When asked, Danny said that it “makes sense” to translate the words in URLs as well, if you feel that the words in the URL help users.
Read more here.
Microsoft & Bing News
Here’s a roundup of some recent news:
- Bing has relaunched Site Explorer, rebuilding it from the ground up. See what’s included here.
- Microsoft has also made Microsoft Clarity, a free user behavior analytics tool, available to a wider audience. Read the blog article here.