Google’s Big Lie About the Impact of “Not Provided” Keyword Data

Fahrenheit Marketing
Fahrenheit Marketing in Design

I’m angry at Google, and I’m not alone: the SEO community has started to realize the full impact of Google’s decision to block keyword referral data for searches performed while people are signed into their Google accounts. What started as a minor inconvenience has turned into a nightmare for SEO companies, which now face inaccurate reporting and guessing games when analyzing their organic traffic.

Let me start with some statistics from FahrenheitMarketing.com to give you a better idea of the problem:


Oct. 28th – 7.1% of clicks from Google show up as not provided
Nov. 2nd – 37.2% of clicks from Google show up as not provided

When this was first announced, Matt Cutts stated on the record that the impact would be in the single-digit percentages. While some of our clients are only seeing a single-digit impact, others are seeing 20 percent or more, and I’d love for Google to tell me what to say when clients ask which organic keywords are generating traffic. Google’s decision undermines the credibility of online agencies because it takes away our ability to act and report on keyword data.
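To give a sense of the guessing game this forces on agencies: one common workaround is to redistribute “(not provided)” clicks across the keywords that are still visible, in proportion to their observed share. This is an estimate, not recovered data; the keywords and click counts below are invented for illustration:

```python
def estimate_hidden_keywords(visible_clicks, not_provided_clicks):
    """Allocate "(not provided)" clicks proportionally to visible keywords.

    visible_clicks: dict mapping keyword -> observed click count.
    not_provided_clicks: clicks hidden behind "(not provided)".
    Returns estimated total clicks per keyword, assuming hidden searches
    follow the same distribution as visible ones (a big assumption).
    """
    total_visible = sum(visible_clicks.values())
    return {
        kw: clicks + not_provided_clicks * clicks / total_visible
        for kw, clicks in visible_clicks.items()
    }

# Invented sample data: 200 visible clicks, 100 hidden ones.
visible = {"austin web design": 120, "seo services": 60, "web development": 20}
estimated = estimate_hidden_keywords(visible, not_provided_clicks=100)
# "austin web design": 120 + 100 * 120/200 = 180.0 estimated clicks
```

The flaw, of course, is the assumption itself: signed-in users may search very differently from signed-out ones, and there is no way to verify it.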

I would support Google’s decision if it were also applied to their AdWords platform, but AdWords users still have access to all keyword data, which essentially puts a price on privacy. The message is that if you want keyword data, you have to pay for it by proxy through AdWords, and this kind of maneuvering is exactly what we’ve seen since Larry Page took over as CEO and the company shifted to a shareholders-first policy.

This behavior is the very definition of hypocrisy, and I respectfully ask Google to reconsider their decision.

Fahrenheit Marketing is an Austin web design firm.


Fahrenheit’s SEO Philosophy

In the latest Fahrenheit Marketing blog, SEO Specialist Will Gallahue takes us a bit further into Fahrenheit’s Search Engine Optimization philosophy. At Fahrenheit, SEO goes beyond backlinks and keyword-rich content and focuses on building an overall web presence through social sites and search engines.

In the video we highlight the fact that testing is a crucial part of the SEO process: an optimized site will not only receive more traffic but also convert better. We sum up our emphasis on testing with the phrase “A click is not a client,” highlighting the fact that an increase in traffic won’t necessarily result in an increase in conversions.

Interested in learning more? Subscribe to our YouTube channel and start receiving new, relevant and helpful videos every time you log in.

Do you have any video ideas? Have a suggestion for a topic you would like to see us cover? Tell us about it by leaving a comment below.

Fahrenheit Marketing is an Austin web design firm.


100th Post: A Click is Not a Client (Video)

In Fahrenheit’s latest video blog, “A Click is Not a Client”, SEO Specialist Will Gallahue briefly runs through the Fahrenheit Marketing Search Engine Optimization philosophy. When evaluating the success of a website, there are many factors to consider beyond traffic volume. Many companies make the mistake of basing performance solely on consumer traffic and page views, failing to realize that a click is not a client. One of the most important elements to consider when measuring success is your website’s click-to-conversion rate.
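The arithmetic behind that metric is simple, and it shows why traffic alone is misleading. As a hypothetical illustration (the click and conversion figures below are invented, not client data):

```python
def conversion_rate(conversions, clicks):
    """Click-to-conversion rate as a percentage of total clicks."""
    return 100.0 * conversions / clicks if clicks else 0.0

# Two hypothetical sites with identical traffic but different outcomes:
site_a = conversion_rate(conversions=50, clicks=5000)   # 1.0%
site_b = conversion_rate(conversions=150, clicks=5000)  # 3.0%
# Same number of clicks, but site B produces three times the clients.
```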

View this video for more information on the Fahrenheit Marketing process.

Interested in learning more? Subscribe to our YouTube channel and start receiving new, relevant and helpful videos every time you log in.

Do you have any video ideas? Have a suggestion for a topic you would like to see us cover? Tell us about it by leaving a comment below.


Are Subdomains the Answer for Panda Penalties?

A new article in the Wall Street Journal suggests that one site owner has found a remedy for sites that saw a huge loss of traffic from the Panda update: subdomains. According to the article, HubPages.com lost 50 percent of its traffic following the algorithm change, and its management team grew frustrated after tighter editorial standards and other fixes failed to yield any results.

While combing through the site, they noticed that some pages had been incorrectly indexed as subdomains, and that those same pages were seemingly unaffected by the changes. They contacted Google with their findings, and Google responded last month by suggesting they try subdomains for content. HubPages started using subdomains for some of its authors and saw traffic recover to pre-Panda levels. Now the site plans to roll the change out site-wide, with each user receiving a subdomain, similar to having a blog on Tumblr or WordPress.
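The author-to-subdomain mapping described above can be sketched in a few lines. This is a hypothetical illustration of the general idea, not HubPages’ actual implementation; the domain and author name are invented:

```python
import re

def author_subdomain_url(author, path, domain="example.com"):
    """Map an author's page onto a per-author subdomain.

    e.g. example.com/hub/page (by Jane_Doe) -> jane-doe.example.com/hub/page
    DNS labels only allow letters, digits, and hyphens, so anything else
    in the username is replaced with a hyphen.
    """
    label = re.sub(r"[^a-z0-9-]", "-", author.lower()).strip("-")
    return "http://{}.{}/{}".format(label, domain, path.lstrip("/"))

url = author_subdomain_url("Jane_Doe", "/hub/breathalyzer-accuracy")
# -> "http://jane-doe.example.com/hub/breathalyzer-accuracy"
```

In practice the hard part isn’t generating the URLs but the 301 redirects from every old path to its new subdomain equivalent, which is exactly the time-consuming work mentioned below.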

It remains to be seen whether the change will work, but I think this is a viable strategy for sites with a large amount of user-generated content. Because UGC varies significantly in quality, separating poor authors from those who create quality work should, in theory, restore traffic to authors whose work was dragged down by association. However, if your site is composed mainly of poor-quality content, you will likely see no real change from redoing your site architecture, because low-quality content is still low-quality content no matter where you put it.

It will be interesting to see whether other article sites like EzineArticles implement, or at least test, some form of subdomains. This is by no means an easy task because of the redirects and time involved, but for businesses built on a UGC model, it may be a necessary change to restore traffic.


SEO Tools Review: Optimize Content with Inbound Writer

Our company generates hundreds of pages of content each month for clients and our own site. We write blog posts and informative articles as well as develop on-site content. However, our ability to measure the effectiveness of an individual piece has largely been limited to the page level, which combines the content with meta tags, URL structure and other components. At that level, good page settings can overshadow problems in the content itself, which is why I was intrigued by a new web-based application called Inbound Writer.

The program analyzes your content in real time and assigns a score based on how well you use the keywords you want to pursue. Keyword use is measured as more than raw instance counts: the tool also scores placement and how well you use long-tail and head terms in your document.

I started by inserting an article we created for a client about inaccurate breathalyzer readings. The next step was choosing three keywords we wanted to target within the document; for this example I chose “breathalyzer,” “DWI,” and “alcohol.” Afterward, the program asked me to select one to three sites that rank for, or are similar in content to, those terms.

It then loaded the content; I added the title from the client’s page, and the program assigned a score of 58, which was quite surprising. The main problems the program identified were that the terms I wanted to rank for weren’t near the beginning of the title tag or the start of the body text. The tool was correct about the body text, because the page uses a long introduction before it starts to analyze potential problems with breathalyzers. However, I disagree with its title tag assessment, because I think it’s better for organic CTR in this instance to start with “Why” rather than a keyword.

On the right-hand side was a list of my target phrases and other related keywords, along with how many times each had been used. The more related terms you bring in, and the more often you use your main terms, the higher the quality score. The one thing authors need to be careful about is writing toward a target score and sacrificing quality by overloading search-friendly content while ignoring readability. If Inbound Writer’s algorithm could measure readability and composition as well as optimization, the tool would be a must-have.

The main problem I had was that the program lacks a keyword density feature. It would be nice to see density, because it is just as important as the raw number of instances. There is also no spell check, which could cause problems if a writer develops content in the program and then ports it directly to an article or blog post. The program runs in Flash, so it is not compatible with the iPad or with browsers that have Flash disabled.
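Keyword density is simple enough to compute yourself while the feature is missing: it is just the share of the document’s words taken up by the target phrase. This is a minimal sketch, not part of Inbound Writer, and the sample text is invented:

```python
import re

def keyword_density(text, phrase):
    """Return the target phrase's density as a fraction of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count every position where the phrase's words appear in sequence.
    hits = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    return hits * n / len(words)

text = "Breathalyzer tests can fail. A breathalyzer reading is not proof."
density = keyword_density(text, "breathalyzer")
# 2 occurrences out of 10 words -> density of 0.2 (20%)
```

Most SEO guidance treats anything past a few percent as over-optimized, so a check like this is mainly a guard against keyword stuffing rather than a target to hit.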

The program gives users eight free articles per month; if you need to optimize more content, the unlimited plan is $20/month. Overall, the program is a promising tool for content writers and SEO professionals, but it needs a little more polish before it’s ready for widespread use.

Update: See comments below for some more information about the product