The Week in Search is a weekly column produced by the Studio team to keep marketing professionals and ecommerce merchants up to date on changes in the search industry, and provide valuable context on what it all means. If you have questions or think we missed something, email us directly.
Google Session at Pubcon Covers a Lot of Topics
Gary Illyes, Google’s Webmaster Trends Analyst, appeared twice at Pubcon Pro Las Vegas this week: a keynote on Tuesday, October 8th, and a follow-up Q&A interview on the 10th. Nathan Johns, a search quality analyst at Google, also shed some light on a few topics. Both covered a lot of information about Google’s updates this year and shared advice on how webmasters can ensure they’re in good standing with the big G.
Here are the major takeaways:
- Crawl rate doesn’t always spike before algorithm updates – SEOs actively try to predict when Google algorithm updates are coming, and one signal they watch is Google’s crawl rate on their site. The assumption is that Google ramps up crawling to index fresh information before it recalculates rankings. Illyes said that crawl rate can increase before an algo update, but that’s not always the case.
- Google does not have an official EAT or YMYL score for your site – Gary Illyes said that EAT and YMYL are conceptual guidelines that help Google’s content quality raters objectively evaluate the qualities its algorithms are designed to measure. Ultimately, Google is not keeping an EAT grade for your site somewhere.
- “Content accuracy” is a ranking factor – Gary Illyes said that because of YMYL grey areas, content accuracy is a ranking factor. How Google determines “content accuracy” is still up in the air, but in the past, the company has said it uses signals to determine authority and accuracy.
- Webmasters don’t need to worry about third-party link analysis tools – Nathan Johns stated that you only need to worry about links reported in Google Search Console’s “Links” report. Even though GSC’s link report maxes out at 100,000 entries, it shows a representative sample of the links pointing to your site.
- Core algorithm updates were clarified – Illyes said that webmasters shouldn’t look at core algorithm updates through the reward/penalty lens. Instead, they should focus on high quality content and being relevant to their customers. Specifically, just because you perceive a loss in rank doesn’t mean your content sucks, it might just mean someone else is more relevant to the query.
- Google doesn’t require you to use the new rel= link attributes – but they’d love it if you did.
Studio Takeaway: We wish we were there.
Bing Improved Bingbot – Evergreen with Better Understanding of JavaScript
Bing has announced some changes to Bingbot. First, they’re making their web crawler “evergreen,” which basically means it will be automatically updated with the latest rendering technology from the Microsoft Edge browser.
They’ll also be improving their crawler’s ability to parse and understand JavaScript by adopting Chromium, the open-source engine behind the next version of Microsoft Edge.
This means there will be fewer discrepancies going forward between how Bingbot and Chromium-based browsers render content.
Studio Takeaway: We love that Bing is getting smart and simply building on Google’s open-source tech to improve its tools. Why reinvent the wheel? By making Bing a more reliable search engine, there’s a chance its adoption rate will go up.
Google Adds Reviewer Image Product Schema
Now ecommerce merchants can apply structured data to customer-submitted images. This week, Google added the <reviewer_images> product schema to its XML schema reference guide.
<reviewer_images> will allow you to showcase several different UGC images. Individual images are wrapped in <reviewer_image> (singular) elements. Here’s what it might look like:
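The snippet below is a minimal sketch based on Google’s product reviews feed format; the surrounding <review> wrapper and the example URLs are illustrative, not copied from Google’s documentation:

```xml
<review>
  <!-- ...other review elements (ratings, content, product identifiers)... -->
  <reviewer_images>
    <!-- each customer-submitted image sits in its own singular element -->
    <reviewer_image>
      <url>https://www.example.com/images/review-photo-1.jpg</url>
    </reviewer_image>
    <reviewer_image>
      <url>https://www.example.com/images/review-photo-2.jpg</url>
    </reviewer_image>
  </reviewer_images>
</review>
```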
Studio Takeaway: The more structured data, the better. There are a lot of ecommerce merchants that live and die by UGC, so giving them a way to clearly showcase that to Google is a no-brainer.
Google Search Console Now Showing Video Performance Dashboard
This week, the Google webmaster blog announced a new reporting feature for videos. Webmasters can now identify issues with their videos and track how they perform across Google Search, Google Video Search, and the Discover feed on mobile. According to the post:
“These new tools should make it easier to understand how your videos perform on Search and to identify and fix video issues.”
Studio Takeaway: Video can be so impactful if done properly, but the drawback is that it demands a higher time investment and more overhead for gear, animation, actors, and editing, and it has always been somewhat difficult to report on how videos perform in search. If you’re using video as a marketing strategy, this is big news.
Other Interesting Links
- Google Testing Out Search Results without URLs: https://www.searchenginejournal.com/google-is-testing-search-results-without-urls/329465/
- #AskGoogleWebmasters Covers Structured Data: https://www.youtube.com/watch?v=kG8L_-fhkNw
- Google Says Links Don’t Expire, They Lose Importance Over Time: https://www.seroundtable.com/google-links-do-not-expire-28331.html
- Google Talks About Machine Written Content: https://www.seroundtable.com/machine-written-content-google-guidelines-28338.html
- Google Doesn’t Recommend Renting Expertise to Improve EAT: https://www.seroundtable.com/google-renting-expert-names-eat-core-algorithm-28342.html
About The Author: Clara Metcalf