RetailMeNot Sees Its Stock Get Whacked By Google Algorithm Update


RetailMeNot is seeing its stock get hit hard by the recent Google algorithm change.

Barry Schwartz wrote on Search Engine Land:

On Tuesday, Google pushed out their revised Panda Algorithm code-named Panda 4.0. Panda is designed to remove low-quality content from Google’s search results. Yesterday, we published an early Winners & Losers report on which large sites were impacted the most both in a positive and negative way.

RetailMeNot was one of the larger sites on the losers list, where Searchmetrics reported an approximate 33% loss in organic search visibility after Panda 4.0 was released. RetailMeNot was quick to issue a statement responding to these reports saying they “greatly overstate the impact on RetailMeNot.com.”

The company statement released this morning reads:

The company believes these reports greatly overstate the impact on RetailMeNot.com. Over RetailMeNot’s history, search engines have periodically implemented algorithm changes that have caused traffic to fluctuate. It is too early to judge any potential impact of the latest Google algorithm change. While RetailMeNot’s traffic with Google continues to grow year-over-year, the company has experienced some shift in rankings and traffic. The company continues to believe its focus on content quality and user experience will continue to help grow the business, enable consumers to save money and drive retailer sales.

Read the full story here

The stock, which trades under the symbol SALE, was down close to 19% on Thursday.

Bloomberg covered the story:

Google Inc. (GOOG) is trying to push out spammy sites from its search results, and RetailMeNot Inc.’s stock is paying the price.

RetailMeNot shares plunged a record 19 percent today after a report said that recent changes in Google’s search algorithm made results leading to the Web couponer 33 percent less visible. Searchmetrics, a provider of digital-marketing software, wrote a list of the winners and losers, with RetailMeNot as one of the biggest losers.

“These reports greatly overstate the impact on RetailMeNot.com,” the Austin, Texas-based company said in a statement today about the changes. “Over RetailMeNot’s history, search engines have periodically implemented algorithm changes that have caused traffic to fluctuate. It is too early to judge any potential impact of the latest Google algorithm change.”

Whether or not the threat is real, the plunge in RetailMeNot (SALE)’s shares underscores the control Google has over Internet traffic and the risks faced by companies that rely on the search engine for users. Google captured 68 percent of U.S. search queries in March, more than triple the market share for Microsoft Corp., according to ComScore Inc.

Read the full story here

This shows once again that if a business relies on Google, or is even perceived to rely on Google, things can go downhill fast with one little change from the engineers in Mountain View.

Google Extends “Not Provided” to Paid Search


Google is making a change that will bring paid search in line with organic search when it comes to relaying referrer information. Now, when people using secure search click on AdWords ads, their search query will not show up in the referrer string, and the information will no longer be available to analytics packages or third-party providers of related software.
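For readers who want to see the mechanics, here is a minimal sketch of how an analytics package pulls the query out of the HTTP referrer, and why secure search leaves it blank. The function name and URLs are illustrative, not taken from Google's announcement.

```python
from urllib.parse import urlparse, parse_qs

def query_from_referrer(referrer: str) -> str:
    """Return the search query embedded in a referrer URL, if any."""
    params = parse_qs(urlparse(referrer).query)
    # Google carries the user's search terms in the "q" parameter when present.
    return params.get("q", ["(not provided)"])[0]

# Before the change, an ad click from non-secure search carried the query:
print(query_from_referrer("http://www.google.com/search?q=running+shoes"))
# -> running shoes

# After the change, SSL search strips the query before the click arrives:
print(query_from_referrer("https://www.google.com/search"))
# -> (not provided)
```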

Ginny Marvin wrote on Search Engine Land:

As suspected, Google is moving to secure search for clicks on paid search ads. In an announcement posted by Paul Feng, Product Management Director, AdWords, on the Ads Developer Blog, the company states:

“Today, we are extending our efforts to keep search secure by removing the query from the referrer on ad clicks originating from SSL searches on Google.com.

Advertisers will continue to have access to useful data to optimize and improve their campaigns and landing pages. For example, you can access detailed information in the AdWords search terms report and the Google Webmaster Tools Search Queries report.”

The search terms report still "lets you see search queries that generated ad clicks along with key performance data." The Search Queries report in Google Webmaster Tools will continue to show aggregate information about the top 2,000 queries that generated organic clicks.

If you’re using search query strings (the terms users typed in before clicking on an AdWords ad) for reporting, in dynamic landing pages or automated keyword generation and expansion, Google recommends the following:

Read the full article here
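Google's specific recommendations are in the linked article, but one common pattern for dynamic landing pages is worth sketching: instead of mining the referrer for the raw user query, the ad's destination URL can carry the matched keyword via an AdWords ValueTrack parameter such as {keyword}. In the sketch below, the "kw" parameter name and the URLs are hypothetical choices for illustration.

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_landing_url(url: str) -> str:
    """Read the matched keyword from our own landing-page URL."""
    params = parse_qs(urlparse(url).query)
    # "kw" is a parameter name chosen for this sketch; AdWords substitutes
    # the matched keyword (not the raw user query) for {keyword} at serve time.
    return params.get("kw", ["unknown"])[0]

# Destination URL configured in the ad: http://example.com/landing?kw={keyword}
print(keyword_from_landing_url("http://example.com/landing?kw=running+shoes"))
# -> running shoes
```

The trade-off is that you get the keyword you bid on rather than the exact phrase the user typed, which is precisely the granularity Google is now withholding.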

George Michie also wrote an article explaining why it's not that big of a deal.

This is an annoyance, but in the great scheme of things, it’s not a major problem. I’ll explain more, but first a little background.

At SMX West in March, Google search chief Amit Singhal directly acknowledged the company recognized the contradiction in providing advertisers with data that non-paying SEOs weren’t receiving. He said the lack of parity was a concern that Google understood and would address. That got some marketers and publishers excited that user query data would soon come back to web analytics packages.

Not so. Instead, we now know that Google’s plans for achieving parity involve blinding paid search folks to query data.

Google announced today that, going forward, an increasing amount of secure search would pass referrer strings to advertisers that omit the user’s search query, so this will appear as “(not provided)” in analytics reports. Over the next month, the percentage of search queries omitted for paid search traffic will reach organic levels, representing nearly all user queries.

Google gave us a courtesy heads up that this announcement was coming, which we appreciate. Having had a chance to consider the situation, my thinking now is pretty much the same as it was back when Singhal spoke and the speculation began.

Read the full article here

Microsoft and the Bing Entity Engine

Frederic Lardinois wrote a piece on TechCrunch tonight about Microsoft and its plans for its entity engine, Microsoft's answer to the Google Knowledge Graph. These products are not the most desirable from a web developer's standpoint: Google and Bing would both like to give people using their search engines a direct answer, keep them on their own sites, and lessen the need to click through to a website. This will certainly not affect all niches, but someone running something like a historical website may see their traffic affected.

Some have come out and blasted Google as nothing more than the very kind of content scraper it claims to despise. Google addressed this at Search Marketing Expo West a few weeks back, when Google's head of search, Amit Singhal, spoke with Danny Sullivan of Search Engine Land.

From the interview:

Amit Singhal brought up the tweet that went viral showing how Google is considered by many to be a massive scraper site. Amit addressed the concern with an analogy.

Amit equated Google's Knowledge Graph to a Swiss Army knife while equating publishers' content to specific tools, such as “corkscrews,” “screwdrivers” and other specialty tools.

Amit explained that while Google's goal is to give searchers a quick answer, there is no substitute for searchers doing deeper research by clicking into the sources provided in the Google search results. He understood that many searchers would likely just want the quick answer, but there is still a need for a deeper dive into the sites and publishers providing that content.

From the Tech Crunch article:

Both Microsoft and Google have long worked on getting users answers without having to click through a number of websites. At Google, the current result of this is the Knowledge Graph and at Bing, it’s the company’s entity engine (previously referred to internally as Satori). Both search engines currently offer very similar experiences related to these engines. Search for “Albert Einstein” in Google and Bing, for example, and the right sidebar will give you plenty of information about him. But both companies have a different approach to how they plan to push these entities forward.

Earlier this week, I sat down with the lead of Microsoft’s Bing Experiences group, Derrick Connell, to discuss the state of entities in Bing and the company’s vision for the future of entities in its search engines. Microsoft clearly has big plans for using entities in Bing and products that rely on it; the company plans to open up a part of this entity engine so more third-party sites will be able to highlight some of their features on Bing.
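For context on where entity data comes from, both engines ingest structured markup that sites publish themselves using the schema.org vocabulary, which Google, Bing and Yahoo launched jointly. Below is a minimal sketch that emits that markup as JSON-LD (one common serialization; engines also read microdata and RDFa). The values simply mirror the article's Albert Einstein example and are purely illustrative.

```python
import json

# An entity description in schema.org vocabulary; values are illustrative.
entity = {
    "@context": "http://schema.org",
    "@type": "Person",
    "name": "Albert Einstein",
    "birthDate": "1879-03-14",
    "jobTitle": "Theoretical physicist",
    "sameAs": "http://en.wikipedia.org/wiki/Albert_Einstein",
}

# Embedded in a page as a JSON-LD script block for crawlers to pick up.
print('<script type="application/ld+json">')
print(json.dumps(entity, indent=2))
print('</script>')
```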

1% of Search Advertisers Drive 82% of Clicks

Laurie Sullivan wrote on MediaPost that only 1% of search advertisers drive 82% of clicks.

If you believe the results of a recent study examining Google AdWords clicks, it may seem that most marketers still don’t know how to run a successful paid-search campaign. While many think they do, only the top 1% of search advertisers in the categories of Retail, Hotels & Resorts, and Computer Hardware drive an average of 82% of search clicks in each category across the United States, the United Kingdom and France.

The study — conducted in January by AdGooroo, a Kantar Media company — examined search advertising activity in 10 business categories on Google AdWords in the U.S., the U.K., and France. With an average of 1% of brand marketers truly successful in search engine marketing, what will it take to distribute the clicks a little more evenly?

“The only way to move the needle on this 1% is for the other 99% of advertisers to raise their game and get better at search,” said AdGooroo CEO Rich Stokes. “This figure reflects the dominance of the top 1% of advertisers. Google runs a competitive marketplace with its ads, but that top 1% has become so effective they command 80% of the clicks by executing effective bid management, leveraging expansive keyword portfolios, providing a solid consumer experience, and more.”

I talk to a number of small businesses that got discouraged when trying to run a Google AdWords campaign; they were lost by the Quality Score and minimum bid system. I think you have to read, test, read and test some more, and always keep tinkering with your AdWords strategy to increase your chances of a successful campaign.
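To make the Quality Score point concrete, here is a toy model. Google has long described its ad auction, in simplified form, as ranking ads by bid multiplied by Quality Score, so a modest bidder with a well-run account can outrank a big spender. The numbers below are invented for illustration.

```python
def ad_rank(max_cpc_bid: float, quality_score: int) -> float:
    """Simplified Ad Rank: max CPC bid (dollars) times 1-10 Quality Score."""
    return max_cpc_bid * quality_score

# Invented numbers: a big spender with a weak account vs. a small business
# with a well-targeted one.
big_spender = ad_rank(max_cpc_bid=4.00, quality_score=3)     # 12.0
small_business = ad_rank(max_cpc_bid=2.00, quality_score=8)  # 16.0

print(small_business > big_spender)  # True: the smaller bid wins on quality
```

That interaction is exactly what trips up newcomers: raising the bid alone can't fix an ad group with poor relevance.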

Matt Cutts Talks About a Softer Panda and a Boost for Sites Using SSL


Matt Cutts had a few things to say at SMX West regarding potential future moves at Google.

Barry Schwartz at Search Engine Roundtable wrote that Cutts would like to give a boost to sites using SSL.

At SMX West Matt Cutts gave the attendees a few tidbits, one of those items was that making your site secure, encrypted, i.e. SSL enabled, is an important trend for 2014.

At the end of the session, I asked Matt if this means Google is looking to give sites that enable SSL a ranking boost. Matt Cutts shrugged his shoulders and explained that if it was his choice, he would make it so. But he said, it is far from happening and there are people at Google that do not want this to happen.
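For site owners wondering what being "SSL enabled" amounts to in practice, here is a small standard-library sketch that checks whether a host completes a TLS handshake with a verifiable certificate, which is the baseline any such ranking signal would presumably look for. The hostname is a placeholder.

```python
import socket
import ssl

def serves_valid_ssl(host: str, port: int = 443) -> bool:
    """True if the host completes a TLS handshake with a valid certificate."""
    context = ssl.create_default_context()  # verifies the chain and hostname
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

print(serves_valid_ssl("example.com"))  # placeholder hostname
```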

Schwartz also wrote an article on Search Engine Land that discussed a softer, more friendly Panda.

Google’s head of search spam, Matt Cutts, announced at Search Marketing Expo that his search team is working on the “next generation” Panda update, which would appear to many as being softer.

Cutts explained that this new Panda update should have a direct impact on helping small businesses do better.

One Googler on his team is specifically working on ways to help small web sites and businesses do better in the Google search results. This next generation update to Panda is one specific algorithmic change that should have a positive impact on the smaller businesses.

How both of these play out remains to be seen, but it shows there is no resting when trying to keep up with Google. When they zig, webmasters have to zag to keep up with everything going on in the wild world of search.