The Week in Search is a weekly column produced by the Studio team to keep marketing professionals and ecommerce merchants up to date on changes in the search industry, and provide valuable context on what it all means. If you have questions or think we missed something, email us directly.

Signs of Another Algorithm Update, Murmurs of Time on Site Being a Factor

Another week, another update. Search Engine Roundtable is reporting signs of another search algorithm update, corroborated by signals from MozCast and other SERP-tracking tools.

Lots of folks in the Search Engine Roundtable comments are reporting some turbulence. Adding another layer of intrigue, one commenter named Bill Lambert left this cryptic prediction last week:

“There should be a big update dropping on the 29 or 30. Last weekends test rollout passed all tests. All that I can say is try to get your website visitors to spend as much time on your website as possible.”

Studio Takeaway: At this point, we should all be on the same page about Google algorithm updates: they happen all the time. At Studio, we regularly see weekly and monthly fluctuations for our clients, so day-to-day or week-to-week ups and downs are pretty normal. What is somewhat interesting about this week’s algorithm news is that it could have something to do with time-on-site stats. Average time on site is an indicator that users enjoy your site and find your content valuable, so it would make sense for this metric to come into play on some level. We’ll see how this update shakes out over the next couple of weeks and whether Google decides to divulge anything about it.
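If you want to gut-check the metric Lambert is hinting at, average time on site is just the mean of your visitors’ session durations. Here’s a minimal sketch, using hypothetical sample values rather than real analytics data:

```ts
// Minimal sketch: average time on site is the mean of individual
// session durations. The sample values below are hypothetical.
const sessionDurationsSec: number[] = [42, 310, 128, 95, 560];

const averageTimeOnSiteSec =
  sessionDurationsSec.reduce((sum, s) => sum + s, 0) / sessionDurationsSec.length;

console.log(`Average time on site: ${averageTimeOnSiteSec.toFixed(0)}s`); // 227s
```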

Google vs. DuckDuckGo

DuckDuckGo has put out a scathing video about Google’s data privacy issues. You can watch it below:

In a rare admission that they actually have competition, Google has responded to the video.

Danny Sullivan, Google’s Search Liaison, put out some fiery tweets defending Google’s search supremacy and pointing out weaknesses in DuckDuckGo’s algorithms. Oddly enough, he didn’t address the data privacy concerns the video raised.

Studio Takeaway: Google typically doesn’t respond to stuff like this. They’ve dominated the search market for so long that acknowledging their competitors would do little more than give weight to their detractors. Maybe the difference this time is that DuckDuckGo is the only major search platform that actually gained market share over the last year.

New Study Shows Slightly Less Than 50% of Searches Result in a Click, Organic or Paid

SparkToro, Rand Fishkin’s new endeavor, has been digging into data from a recent study by Jumpshot, a clickstream-data company, that surfaced some surprising numbers on search clicks.

According to the study, only 49% of searches actually result in a click, organic or paid: 45% of searches end in an organic click, and just under 5% end in a paid click.

It’s also worth noting that over the last three years, organic clicks have decreased by 8% while paid clicks have increased by 2%.

Studio Takeaway: Google is actively chipping away at the click wall between a user’s question and a website’s answer. Their push for local results, rich snippets, and an array of new ad tools is only going to keep driving the percentage of organic clicks down.

Google Search Gurus Talk About the State of JavaScript SEO

Google has been making an active push to ensure that Googlebot is able to crawl and index JavaScript-based content and websites. Because JavaScript content renders on the client side and isn’t present in the HTML the server initially returns, bots have historically had a difficult time making sense of that content.
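To see why, here’s a minimal sketch of client-side rendering (the endpoint and element ID are hypothetical). The server ships an empty container, and the content only exists after the script runs in the browser, so a crawler that doesn’t execute JavaScript sees none of it:

```ts
// Minimal client-side rendering sketch. The server's initial HTML contains
// only <div id="product"></div>; everything below is injected in the browser.
async function renderProduct(): Promise<void> {
  // Hypothetical JSON endpoint; the product data never appears in the raw HTML.
  const res = await fetch("/api/products/123");
  const product: { name: string; description: string } = await res.json();

  const container = document.getElementById("product");
  if (container) {
    // A crawler that doesn't execute JavaScript never sees this markup.
    container.innerHTML = `<h1>${product.name}</h1><p>${product.description}</p>`;
  }
}

renderProduct();
```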

JavaScript-based website builders are hot tech because they make it possible for users with little to no development or design knowledge to build gorgeous, fully responsive sites.

In a recent Webmaster Office Hours hangout, John Mueller and Martin Splitt talked about the status of JavaScript SEO. A viewer asked whether JavaScript SEO is still relevant now that Google has improved the way Googlebot indexes and analyzes JavaScript content.

Essentially, both Mueller and Splitt said that JavaScript SEO isn’t going anywhere any time soon. In fact, the discipline is evolving right alongside Googlebot, which is actively learning where its deficiencies are and how to improve them so it can index JavaScript sites better.

Studio Takeaway: Tech companies have a responsibility to make their products more usable. For companies like Volusion that help people build websites, a JavaScript-based tool has a lot of benefits: it’s fast, flexible, and scalable.

Historically, there has been friction between companies building the best tools for their users and how effective those tools are in the real world. Up until fairly recently, a JavaScript-based site had a much harder time ranking than it does today, and Google is making an active push to close that gap.

If you want to see what Google currently recommends for JavaScript SEO, go here.

You Don’t Have to Fix Structured Data Warnings

John Mueller has provided some insight into those pesky structured data warnings in your Search Console reports. For a long time, people have wondered how much weight to give those warnings. Well, now we have an answer. Here are Mueller’s responses, as reported by Search Engine Journal:

“So there are two things here. On the one hand, this is a warning. So it’s not an error that will… block everything.”

“It’s basically just saying… it would really help us to have an ID here. So if there were multiple versions of this product or multiple people selling the same product, we can group them together potentially.”

“…it’s not that we would like not process it at all. It’s not an error, it’s just a warning.”

“You don’t have to fix all warnings. A lot of sites have warnings with structured data and that’s perfectly fine.”
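For context, the warning Mueller describes typically comes from omitting a recommended (but optional) property, such as a product identifier. Here’s a minimal sketch of Product structured data in JSON-LD, with hypothetical values, that would validate with a warning rather than an error:

```ts
// Minimal Product structured data sketch (hypothetical values). Required
// properties are present; recommended identifiers like "gtin" or "mpn" are
// intentionally omitted, which Search Console reports as a warning, not an error.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Widget",
  description: "A sample product used for illustration.",
  offers: {
    "@type": "Offer",
    price: "19.99",
    priceCurrency: "USD",
  },
  // No "gtin"/"mpn" here: per Mueller, this kind of warning is fine to leave.
};

// Inject the markup the way most sites do: a JSON-LD script tag in <head>.
const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(productJsonLd);
document.head.appendChild(script);
```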

Studio Takeaway: At Studio, we’ve been fielding a lot of questions from our clients about structured data warnings and what they should do about them. Our response has always been: “Google is sending warnings for these ‘missing’ fields even though they’re included on the site. GSC is simply looking for meta tag syntax that we don’t use on our fields. Those warnings can be ignored.”

We always assumed that since it was a warning, it wasn’t having a negative impact on search rankings, but we couldn’t “officially” back it up. Now, we can.

Bing Continues to Improve Its SEO Tools by Letting You Import Your Data from Google

More Bing news this week. You can now verify your site in Bing Webmaster Tools using your existing Google Search Console data, and you can import your Google My Business listing data into your Bing Places profile.

Studio Takeaway: Smart move by Bing. Their tools are continually playing catch-up with Google’s, and I’d bet a lot of webmasters and site owners don’t bother with Bing’s tools because they’d essentially have to go through the whole setup process twice. Now you can simply import data you’ve likely already got set up.

Other Interesting Links