
After More than a Month, the Impact of Panda 4.2 Remains a Mystery


Back in July, Google began to roll out Panda 4.2: the first update to the Panda algorithm in almost 10 months, since version 4.1 was released last September. Panda 4.2 is more of a refresh of the existing algorithm than a complete overhaul, and will take several months to roll out completely. Google says that Panda 4.2 has affected 2% to 3% of English-language queries–approximately 36 million searches. All of these factors mean that very few people have noticed an immediate or significant fluctuation in their organic rankings.

With only certain pages on affected sites currently experiencing changes, it’s difficult to determine the exact ranking factors impacted by Panda 4.2. And with Google maintaining their typical evasiveness on the specifics of algorithm updates, until more data is available, our best chance at understanding Panda 4.2 is to look at the Panda algorithm as a whole and infer from updates that have come before.

The Panda Algorithm: An Overview

A helpful breakdown of Panda algorithm updates from Search Engine Land shows that Panda 4.2 is, in fact, the 30th update since Panda first rolled out in February 2011. Google’s intent with the original Panda algorithm was to serve users with search results most directly relevant to their queries. In order to ensure the “best” sites ranked highly, Panda weeded out sites with irrelevant or threadbare content that was optimized simply to rank for targeted keywords. By penalizing so-called spammy sites, Panda rewarded websites with highly relevant, quality content that was not only optimized, but also engaged the user and answered their questions.

With Panda, Google’s message to marketers is clear: the content on our websites should be thorough and authoritative, and aim to create a helpful user experience. Moz’s Rand Fishkin has often said that content should go above and beyond unique or original, and aim to be “10 times better than anything out there.” Each subsequent Panda update has continued to emphasize the importance of relevancy to the user, and the Penguin algorithm also shows how much emphasis Google places on this.

Moz’s Google Algorithm Change History says that the immediate impact of Panda 4.2 is unclear, and even more than a month later, the nature and reach of the update remains a mystery. Because it is taking so long for Panda 4.2 to roll out, only a few pages on affected sites are experiencing changes at a time, and many people have yet to report any change in their rankings whatsoever.

Google cites technical reasons for the slow pace of the rollout, but many webmasters are being left confused and frustrated. At any rate, it will be months before affected sites experience the full impact of Panda 4.2, once the update has completely rolled out.

Responding to Panda 4.2

As with any Panda update, it is already too late to apply any changes that will have an immediate positive effect on your site. According to the SEM Post, “these updates have a cut-off date and any changes made after this date will be applied to the next refresh or update.” Unlike other Google algorithms that are “everflux”–continually altering search results–Panda algorithms require actual updates. An update occurs, and rankings rise or fall depending on the factors emphasized by the update. But the changes you make in response to the impacted rankings aren’t factored in until the next Panda update. Whatever changes you make in response to Panda 4.2 won’t be counted in your favor until (the hypothetically titled) Panda 4.3.

This is why the slow rollout of Panda 4.2 presents so many concerns: it will take months before you’re able to see how you were affected and how you need to respond. On the flipside, it will also take months to see if the changes you made after Panda 4.1 had a positive impact on your site.

When the update rolls out completely, assuming (as most of the speculation thus far does) that Panda 4.2 continues the trajectory of previous Panda updates, your site’s organic performance after your last round of updates should indicate if you’re on the right track. If your site experiences an increase in rankings after Panda 4.2, then continue with a strategy that expands upon the updates you made with Panda 4.1. If your site is penalized once Panda 4.2 completely rolls out, then consider revising your strategy.

At this point, with all the mystery surrounding Panda 4.2, all we can do is wait and see. But if four years of Panda updates have shown us anything, it is that SEO has become less about a formula to optimize for search engines and more about a holistic strategy that seeks to directly benefit the user. Even if it takes months before we fully understand Panda 4.2, as long as you continue to create outstanding content and respond to the current landscape of your vertical, you are well positioned for future Panda updates.

European Commission Prepares Formal Antitrust Charges Against Google


The European Commission appears to be preparing to file formal antitrust charges against Google, according to the Wall Street Journal. As the European Union’s top antitrust authority, the Commission wields tremendous power in implementing and enforcing legislation that restricts companies from forming monopolies or depriving competitors of a fair share of the market.

While Google is an American company, the search engine maintains an even stronger presence in European search markets than it does in the United States. This makes Google subject to European business regulations, and entitles bodies such as the European Commission to investigate any practices perceived to violate those regulations.

This latest episode is not the first time the European Commission has explored legal action against Google, but it would mark the first formal charges filed against the company. While it is still possible that Google and the Commission may reach a settlement, it seems increasingly likely that the case will finally move forward.

Google Vs. the European Commission

Back in December, Fuze reported the latest in the antitrust investigation, which had by then been ongoing for more than four years. On November 27, 2014, the European Parliament voted to separate Google’s search engine from its advertising businesses. The vote was symbolic–the Parliament has no legal authority to dissolve an American company’s properties–but it demonstrated an overall legislative attitude that placed pressure on the Commission to act. It falls well within the European Commission’s power to separate Google’s businesses, and the November vote had the potential to shape the course of the Commission’s investigation.

The case began in November 2010, when various European companies began filing confidential complaints against Google for its exertion of dominance over the search market. Among those complaints were allegations that Google had effectively formed an internet monopoly by emphasizing its own properties or those of its advertising partners in search results and limiting the web presence of competitor properties.

In light of the complaints, the European Commission was tasked with determining whether or not Google should “unbundle” their search engine from their other businesses to open search rankings up to competitors’ voices. Throughout the investigation, numerous solutions and settlements were proposed to avoid formal charges, with the European Commission even working with Google to placate the search engine’s opponents. Still, most proposals were dismissed by complainants as too lenient, and the investigation remained open with no real resolution in sight.

Cold Case? New Leadership Suggests New Consequences for Google

Throughout the investigation thus far, the European Commission has seemed to be on Google’s side. Joaquín Almunia, Vice President of the Commission, was staunchly in favor of speedy settlements that avoided legal proceedings and benefited all stakeholders. In one 2012 statement, he was quoted as saying, “these fast-moving markets would benefit from a quick resolution of the competition issues identified. Restoring competition swiftly to the benefit of users at an early stage is always preferable to lengthy proceedings.” In essence, Almunia sought to reach a compromise that satisfied all parties involved, including Google.

Settlement proposals reached by Google and Almunia often saw the company’s search and advertising businesses remaining intact. These solutions, the most notoriously attacked of which was known as “rival links,” often looked the same: Google retained its authority over search results, prominently displaying its own advertising partners’ sponsored links while relegating a fixed number of placements to be bid on by competitors such as Amazon. Studies commissioned by complainants in the investigation found these settlements to provide little additional value to competitors, and Almunia was criticized for his complacency toward Google’s perceived monopoly.

Now, however, as antipathy grows towards Google in Europe, the almost five-year-long investigation is coming to a head. In March, the Wall Street Journal published a previously secret document that was part of a Federal Trade Commission (FTC) investigation into Google’s business practices. The FTC is the United States’ equivalent of the European Commission, and while they have never filed charges against Google, they did investigate the company back in 2012. This document is from that investigation, and alleges that Google repurposed content from other publishers (such as Yelp, Amazon, and TripAdvisor), and penalized them in search results when they prohibited Google from using their content.

The document released by the Wall Street Journal indicated that the FTC shares many of the European Commission’s concerns, thus renewing the focus on Google’s practices. Did this document incite the Commission’s actions, or did the European Parliament’s symbolic vote to dissolve Google’s businesses actually hold sway over legislators’ decisions?

Late last year, Margrethe Vestager succeeded Almunia as the European Commission’s head of competition policy, and under this new leadership, the Commission’s stance against Google appears to be hardening. Business Insider reports that the Commission is now asking for various companies’ confidential complaints against Google to be made public. According to the article,

“The specific document the EU is reportedly preparing is known as a Statement of Objections. Once filed, it could kick off several years more of deeper investigations, counterstatements, and settlement discussions. If the company and the EU cannot reach a settlement, the EU could then issue penalties, including fines and restrictions on Google’s behavior.”

The very fact that the European Commission is requesting to publish the complaints indicates they are in the final stages of gathering documentation to file charges. What lies ahead for Google remains uncertain. To reach a settlement at this point would require Google to make drastic changes to Universal Search, and any proposal would be met with even more intense scrutiny.

If they cannot settle, Google may be facing fines of up to 10% of their annual revenue–roughly $6 billion based on last year’s numbers. They would then have the right to appeal, but given the widespread sentiments against Google throughout the European Union, there is no guarantee they would win. By any measure, the conflict between Google and European search markets is far from over.

Mobile Search to Become a Ranking Factor in Google’s Algorithm


Google just announced two major updates to the search algorithm that will mean better rankings for mobile-friendly sites. Beginning on April 21, 2015, Google will start to include mobile-friendly factors in its rankings, allowing sites that are optimized for mobile search to rank higher for searches done on a mobile device. In addition, starting immediately, Android apps that are indexed by Google through App Indexing will also rank better in mobile search.

Google expects these changes to have a “significant impact” on mobile search results globally and across all languages. They are postponing the mobile-friendly ranking update until April to allow webmasters enough time to prepare. Sites that are not currently optimized for mobile search must now, more than ever, pay attention to Google’s increased emphasis on mobile optimization.

As it is, these two updates indicate a rapid and important shift on the search engine’s part in responding to the growing number of searches that come from smart phones and tablets. As recently as January, Google began sending warnings to webmasters whose sites were not optimized for mobile, yet there was no indication that this was soon to become a ranking factor.

Still, this news does not come as a surprise to experts who have been following the sustained uptick in mobile search over the last few years. Using mobile-friendliness as a ranking factor seems like a natural progression for Google as we finally enter what many believe to be the “year of mobile.”

Have We Entered the Year of Mobile?

Digital marketing experts across the web are proclaiming 2015 the year of mobile. It’s been obvious for some time that trends in mobile search are on the rise: the percentage of web traffic driven by users on mobile devices has increased steadily over recent years, with consistent and sustained growth indicating that these patterns are likely to continue. In fact, VentureBeat has predicted that by 2016, mobile will account for 61 percent of all site visits on the web.

As marketers, we’ve been hearing for a while now that each new year is the “year of mobile”, but these latest updates from Google mark a definitive new phase in web marketing. Not only are the trends speaking for themselves, but Google is responding by further integrating user behavior into search results to account for people searching more frequently on their mobile devices. The fact that Google has now incorporated mobile-friendly factors into its rankings, after years of speculation about the future of mobile search, proves that we are in fact in the year of mobile.

Mobile Tips for the Current Search Landscape

User experience is crucial to an effective mobile SEO strategy: ensure that the content on your site displays correctly, not only on desktop, but across mobile devices. Neatly displayed, easily accessible content is key to mobile success. Think about how people search on their phones. They are on the go. They need quick answers. They need rich, visually stimulating content that won’t require them to slow down to navigate dense text or a clunky layout.

To accommodate users without having to create multiple versions of the website, most sites turn to responsive design. A website built with a responsive design can “respond” to the type of device from which it is accessed. Its layout seamlessly adapts to display correctly on that device–be it a desktop computer, tablet, or smartphone. Content renders in different sizes and layouts depending on the size and orientation of the screen, so that users can interact with the same content regardless of their device. Content is easy to manage, and this eliminates the need to maintain separate mobile pages on separate domains, thus preserving the page and link authority of the original site.

A responsive design is essential to the mobile user experience. When designing across devices, you should make sure that your site navigation is clean and not overly complex, and prominently display key contact and location information. Your content should be easily digestible, and should incorporate a mix of written and multimedia content.

Building a website around mobile is no longer just an option. A mobile-friendly, responsive design should be at the forefront of your web strategy if you want to compete in the growing mobile search market. Some argue that mobile optimization is now even more important than desktop. Forbes contributor Jayson DeMers suggests that mobile should be your priority. According to DeMers, “the industry standard used to be to create a design that worked on standard computers, then to ensure it was accessible via mobile. However, many leading design experts are now suggesting that good design means concentrating on mobile first, and desktop second.”

It is one thing to hear about how mobile is changing the search landscape as a whole, but what does that mean for your own site? How much should mobile search inform your web strategy? The answer is probably “a lot”, but don’t just listen to the industry’s take on the issue.

In order to truly understand how mobile search affects your site, and how to leverage this growing market, spend some time gaining insight into your own audience. Dive into Google Analytics to see how many of your visitors access your site from a mobile device. Do they use smart phones? How long do they stay on your site from their mobile device, and what kind of content are they consuming? Analyzing this data will help you serve your audience’s mobile needs by understanding their behavior and tailoring your content to provide the most value to them.
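To make that analysis concrete, here is a minimal sketch of the kind of device breakdown described above. It does not use the Google Analytics API; it simply summarizes hypothetical exported session rows (the sample numbers below are made-up placeholders) to show mobile’s share of traffic and average time on site per device.

```python
# Illustrative sketch, not the Google Analytics API: summarize exported
# session rows to see how much of your audience is mobile.
from collections import defaultdict

# Hypothetical exported rows: (device_category, session_duration_seconds)
sessions = [
    ("mobile", 45), ("mobile", 80), ("tablet", 120),
    ("desktop", 300), ("desktop", 210), ("mobile", 30),
]

def device_summary(rows):
    """Return per-device session counts, traffic share, and average duration."""
    counts = defaultdict(int)
    durations = defaultdict(int)
    for device, seconds in rows:
        counts[device] += 1
        durations[device] += seconds
    total = sum(counts.values())
    return {
        device: {
            "sessions": counts[device],
            "share": counts[device] / total,            # fraction of all sessions
            "avg_duration": durations[device] / counts[device],  # seconds
        }
        for device in counts
    }

summary = device_summary(sessions)
print(summary["mobile"]["share"])  # half of this sample's traffic is mobile
```

A real report would also segment by landing page or content type, which is how you answer the “what kind of content are they consuming?” question above.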

Knowledge is Power: Preparing for the Growing Importance of Google Knowledge Graph


Author and tech journalist Steven Levy has released an in-depth study on the evolution of Google over the years, and the continued advancements that have allowed the search engine to compete in the changing landscape of search–in particular, mobile search’s growing influence over desktop search. A key finding of this report involves Google’s Knowledge Graph: the product that aggregates information from various publishers to produce a direct answer to search queries that appears above the standard SERPs. The Knowledge Graph does not appear for every query, but according to Levy’s report, Google has suggested that as much as 25% of all searches may yield a Knowledge Graph result.

While we aren’t able to anticipate which queries Knowledge Graph shows up for, this 25% approximation is substantial enough to demonstrate a change in the way Google is thinking about how people search.

Entities, Not Keywords; Things, Not Strings

The Knowledge Graph is appearing for enough people to show that Google is taking user intent seriously: it is evolving to become more intuitive–to draw connections between the way humans think and the way humans search, and to provide results that reflect that. In essence, Knowledge Graph is representative of the overall shift to semantic search that became most apparent after the Hummingbird update in 2013. With semantic, or conversational search, Google aims to provide results directly relevant to the concepts or ideas people naturally search as opposed to a string of keywords. These overarching concepts, or “entities”, are shaping the future of search, and are the foundation of the Knowledge Graph.

According to Google’s original blog post announcing Knowledge Graph:

…we’ve been working on an intelligent model—in geek-speak, a “graph”—that understands real-world entities and their relationships to one another: things, not strings. The Knowledge Graph enables you to search for things, people or places that Google knows about—landmarks, celebrities, cities, sports teams, buildings, geographical features, movies, celestial objects, works of art and more—and instantly get information that’s relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.

Keywords in the traditional sense therefore take secondary importance to the relationships between the terms in the search query–the way a user’s entire search conveys the thing they are looking for. In this way, Knowledge Graph is able to pull information from various sources on the web to create the panel of information users now see in some of their searches. Product manager Emily Moxley compares Knowledge Graph to a highway system, telling Steven Levy:

…you sort of get off that exit and say, ‘Okay, what are some possible Knowledge Graph things that might be interesting for this query?’ — and we search all of these documents and return relevant ones. Then you join back up with 95 and we say, ‘Okay, we thought this stuff was interesting, so let’s surface that information more prominently.’

Knowledge Graph’s primary sources are Wikipedia, Freebase, and the CIA World Factbook, but as it continues to evolve, it pulls from more and more sites across the internet. Knowledge Graph information also powers event listings in Google Maps and notifications in Google Now. Herein lies the importance for SEOs looking to “get into” the Knowledge Graph:

  • to have content that is optimized enough for semantic search that it’s deemed relevant for thematically similar queries and is pulled into the Knowledge Graph as a listed source, and
  • to be able to submit content to Knowledge Graph sources such as Wikipedia, and thus have some level of control over third party information about your brand that might be pulled into a Knowledge Graph result.

Optimizing for Knowledge Graph

When a search engine focuses on the entities present on a webpage, the quality of the information on your site matters most. A keyword strategy centered on keyword density and a fixed set of targeted keywords per page will no longer cut it. Your website’s content must be consistent, authoritative, and thematically relevant, containing entities that are relevant to both your vertical and the kinds of searches you’re hoping to attract traffic for.

Knowledge Graph interprets entities in two different ways: implicitly and explicitly. Implicit entities are the elements in the actual text of your website. They are the topics of content on your site–the natural language you use to present information to the user. To an extent, they are also the keywords you’re going after; but you should treat those keywords as overarching guides for the creation of content. In order to optimize for Knowledge Graph, identify the topics and information that matter to your audience and create content around them. Make sure the topic of each page is clear and that the content relates to that focus.

Explicit entities are the structured data markup on each web page: the information Google crawls to determine what a page is about. This markup allows the search engine to identify the entities on your page and determine the relationships between them. This is crucial to getting into Knowledge Graph, and the best way is to mark up your site using the structured data vocabulary from Schema.org. Mark up each page, ensuring that the elements are properly identified, so that the information on your page is readily crawlable. It is essential that your implicit and explicit entities tell Google the same thing: your data markup must say that your page is about what your content discusses, and vice versa.
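As a minimal sketch of what “explicit entities” look like in practice, the snippet below generates a Schema.org `Organization` block in JSON-LD, one of the markup formats Google crawls. The brand name and URLs are hypothetical placeholders; a real implementation would use your own details and the Schema.org type that matches each page.

```python
# Sketch: build a Schema.org JSON-LD block describing a brand as an entity.
# All names and URLs below are hypothetical placeholders.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Widgets Co.",            # hypothetical brand
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    # "sameAs" links the entity to third-party profiles that Knowledge
    # Graph sources (such as Wikipedia) may also describe.
    "sameAs": [
        "https://en.wikipedia.org/wiki/Example",
    ],
}

# The resulting JSON would be embedded in the page's <head> inside a
# <script type="application/ld+json"> tag.
json_ld = json.dumps(organization, indent=2)
print(json_ld)
```

The point of the markup is exactly the consistency described above: the `@type` and properties should state explicitly what the page’s visible content already says implicitly.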

Knowledge Graph won’t appear for a hundred percent of all searches–at least not for a while. There is a chance that none of your content is currently being pulled into a Knowledge Graph result, and that all of your organic search traffic comes from standard listings. But 25% is a significant number, and in all likelihood, that number will continue to grow. It is representative of an overall change in the way search engines think, and the way we must adapt to keep up. Consider optimizing for Knowledge Graph–that is, essentially, optimizing for semantic search–as optimizing for the future.

European Parliament Votes to Break Up Google Businesses


In a brazen move, the European Parliament passed a vote on November 27th to break up Google’s search and advertising businesses. The vote comes after a more than four-year-long antitrust investigation launched by the European Commission in response to allegations that Google has formed a monopoly in European search markets. The complaints, which were raised by a variety of European publishers and political parties, suggested Google’s dominance over the internet led to a disproportionate emphasis on Google-owned properties that severely restricted competitors’ share of the market.

The vote, while significant in its implications, is wholly symbolic. The European Parliament has no legal right to force the dissolution of an American company’s properties, but according to the Financial Times, the decision represents a substantial amount of pressure on the European Commission, which does control legislation. The Parliament’s increasing influence over the Commission can potentially shape how legislators respond to the decision, and how they proceed in allaying accusations against Google.

The Case Against Google

In November 2010, the European Commission was tasked with determining whether or not legal action must be taken to “unbundle” Google’s search engine from its other commercial products, and prohibit Google from giving preferential placement to its own products or advertising partners in search results.

Despite mounting pressure from complainants, the European Commission remained determined to settle the matter without the need for legal proceedings. In 2012, Joaquín Almunia, Vice President of the European Commission on Competition Policy, released a statement insisting that “these fast-moving markets would particularly benefit from a quick resolution of the competition issues identified. Restoring competition swiftly to the benefit of users at an early stage is always preferable to lengthy proceedings…” The statement identified four key areas of Google’s exertion of dominance:

  1. Vertical Search Services: Google prominently displays links to its own specialized services, such as Google Shopping, Maps, YouTube, and Google News.

  2. Copying Competitor Materials: Google copies original content from competitors and presents the material as its own answers to search queries. According to Almunia, “they are appropriating the benefits of the investments of competitors…this could reduce competitors’ incentives to invest in the creation of original content for the benefit of internet users. This practice may impact for instance travel sites or sites providing restaurant guides.”

  3. Google Advertising Partnerships: Google restricts the placement of advertisements to websites that have purchased ads with Google.

  4. Google AdWords: Google restricts the portability of AdWords campaigns, preventing advertisers from transferring their search advertising campaigns to other platforms.

Google expressed a willingness to cooperate with the European Commission to address these issues. This past February, according to the New York Times, Google agreed to a settlement to avoid fines and a “finding of wrongdoing”. The settlement, which would allow competitors to buy top-ranking spaces in search results, was met with brutal criticism from the parties who launched the complaints against Google. Back in September, Almunia told Bloomberg TV:

We received a lot of complaints. We have been trying to obtain from Google proposals to overcome the difficulties and the concerns. Now with the last version of proposals we came back to the complainants. The complainants sent us replies during the summer. Some of these replies are very very negative…complainants have introduced new arguments, new data, new considerations so we now need to analyse this and to see if we can find solutions…it’s a long investigation, it’s a complex issue.

The complaints against the settlement led to the European Commission’s reopening of the case, according to The Guardian, and ultimately, to Parliament’s vote to break Google up.

Free Search, Free Speech?

A case like this raises questions about the legality of a search engine’s authority over its own search results. Is Google breaking the law by favoring its own services? Is it betraying a code of ethics? Similar cases against Google in the United States have yielded opposite results. In a case brought by CoastNews, a regional weekly newspaper, a San Francisco court ruled in favor of Google, contending that search results are a matter of “free speech”, and that Google is entitled to display them however it wants.

According to The Telegraph, “the ruling underlines the stark difference in how US and European authorities approach the issue of search engine regulation. In Europe, regulators are in the process of imposing a series of measures – such as forcing Google to display rivals’ ads in prominent places – to address the company’s allegedly anti-competitive practices.” The First Amendment’s guarantee of freedom of speech in the United States, however, has potentially strong implications on Google’s practices domestically.

Google Responds

Is Google a monopoly? Google Chairman Eric Schmidt contends they are not. He recently spoke to an audience of thought leaders in Berlin to deny these claims, citing Amazon as a viable competitor for Google:

People don’t think of Amazon as search, but if you are looking for something to buy you are more often than not looking for it on Amazon…They are obviously more focused on the commerce side of the equation, but, at their roots, they are answering users’ questions and searches just as we are.

This sentiment was reiterated in an open letter to the Financial Times, where Schmidt insisted that Google is not the “gateway to the internet”, and that with the rise of mobile apps, eCommerce, social media, and online directories, people now go directly to other sources when searching for information.

Whether or not Google is a monopoly is up for debate, and the impact that the European Parliament’s symbolic vote will have on the European Commission’s legal actions against Google remains to be seen.

Amazon vs. Google: The Growing Influence of e-Commerce


In a recent speech delivered at the Berlin headquarters for Native Instruments, Google’s executive chairman, Eric Schmidt, stated that when it comes to competition, Amazon represents the biggest threat to the search engine. “Many people think our main competition is Bing or Yahoo. But, really, our biggest search competitor is Amazon,” said Schmidt. His statement came in response to accusations that Google has become a monopoly by dominating the search market.

According to NetMarketShare, Google leads the global search marketplace with approximately 58% of all worldwide searches. While this is hardly a monopoly, it is far ahead of the first runner-up, Baidu, at 29.06% of global searches, and dwarfs Bing’s 8.10% and Yahoo’s 4.01%. Compared to other search engines, Google does enjoy a level of dominance. So, why is Google wary of Amazon’s growing influence? How do e-commerce sites constitute a viable threat to search engines?

Amazon: The “Next Google”?

In theory, an e-commerce site like Amazon, which is driven by product sales, should not be in direct competition with a search engine. Yet Eric Schmidt argues that Amazon accounts for a definite loss in Google searches. Says Schmidt:

“People don’t think of Amazon as search, but if you are looking for something to buy you are more often than not looking for it on Amazon…They are obviously more focused on the commerce side of the equation, but, at their roots, they are answering users’ questions and searches just as we are.”

In this regard, Amazon becomes its own “e-commerce search engine.” Rather than searching for products on Google, people increasingly go directly to Amazon to find what they’re looking for. Google becomes an unnecessary middleman, and its losses are twofold: the number of searches decreases as users look elsewhere for products, and ad revenue falls with it. People searching on Amazon do not see the paid advertisements they would see in Google search results, so Google loses money on clicks.

Google is still a major player in the search market, but these trends suggest that Amazon has gained enough traction to become a significant threat in the e-commerce market. According to Schmidt, “Research by the Forrester group found that last year almost a third of people looking to buy something started on Amazon — that’s more than twice the number who went straight to Google.” Add to this the fact that listings on Google Shopping are paid listings, and the numbers speak for themselves: users are searching for products where they know they can find exactly what they’re looking for.

Google Fights Back

Amazon continues to grow. In 2012, The New York Times reported that product searches on Amazon grew by a whopping 73% in just one year. It is the world’s largest e-commerce site, and its efforts to diversify its product offerings (ebooks, streaming services, and mobile devices) suggest this growth will continue in other verticals. Google has attempted to recapture its share of that audience and stay abreast of the competition. They’ve recently expanded Google Express (an expedited delivery service in the vein of Amazon Prime) to more cities in the hopes of keeping users on Google Shopping. They’ve also revamped their product listing ads to be more dynamic and engaging. Now, when users search for products, they are taken to an Amazon-like “digital showroom” where they can browse directly on Google.

Amazon is still a long way from overtaking Google as a whole, but its momentum is enough for the search engine to be concerned. As Schmidt says,

“The next Google won’t do what Google does, just as Google didn’t do what AOL did. Inventions are always dynamic, and the resulting upheavals should make us confident that the future won’t be static. This is the process of innovation.”

Many SEOs have fallen into the assumption that Google could only be eclipsed by another search engine. Moz argues that SEOs are actually ignoring Amazon altogether, and missing out on important opportunities. Ranking well in search is one thing; ranking well on e-commerce sites is another entirely. In short, if something makes Google take notice, it’s something we should pay attention to.

*The accusations made against Google are part of a larger antitrust investigation launched by the European Commission in 2010 to determine whether or not Google holds a monopoly over internet and mobile search. The investigation has taken four years, and a ruling is forthcoming.

HTTPS Everywhere: Where is it Now?

It has been almost a month since Google announced that it would begin using HTTPS as a new ranking signal. While the announcement, made on August 6, 2014, was long-rumored, it was met with questions as to what effect HTTPS would have on sites. This new ranking factor is not part of an algorithm update. Unlike Panda or Penguin, it is its own separate signal, which raises the question: how strong is the signal?

SEOs and webmasters are working to balance the logistics of the migration of a site from a regular connection (HTTP) to a secure connection (HTTPS), which can be costly and labor intensive, with the urgency to make the move. For now, it seems as if the changes are minimal, but it is still important to prepare for the possibility that HTTPS will begin to play a larger role in how your site ranks.

What is HTTPS?

HTTP stands for Hypertext Transfer Protocol and is essentially the way websites communicate—the connection through which data is transferred between a browser and a website. Most web traffic goes through HTTP connections, which are non-encrypted and can therefore be read by anyone positioned between the user and the server. This is how we are able to glean data on site visitors that helps us determine user intent, search queries, and other potentially identifying information.

The “S” in HTTPS stands for “secure”. HTTPS sites use an SSL certificate (typically with a 2048-bit key) to establish a secure connection. SSL—or Secure Sockets Layer—certificates encrypt the connection so that user data cannot be read in transit. It is a means of establishing trustworthiness and protecting the privacy of users. An HTTPS site, in short, is a site that uses an SSL certificate to encrypt the data visitors exchange with it.

HTTPS Everywhere

Google’s consideration of HTTPS as a ranking signal comes as part of an overall strategy to promote security and user safety across the Internet in general. Earlier this year, Google called for “HTTPS Everywhere”, urging webmasters to consider adding SSL certificates to their sites and alluding to the possibility that HTTPS might become a ranking signal at some point down the road. Google already uses HTTPS encryptions for most of their own services, such as Search, Gmail, and Google Drive, and is now encouraging others to follow suit.

HTTPS: Now or Later?

There has been much debate as to when webmasters should act. The migration from HTTP to HTTPS is an involved and complicated process, as the switch does not happen site-wide automatically. Each URL must be migrated from HTTP to HTTPS using 301 redirects, and such a large number of redirects can negatively impact rankings. The upside, however, is that you can migrate your site in sections, allowing you to prioritize pages and test the effects of the migration.
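For sites running on Apache, the 301 redirects described above are often handled with mod_rewrite rules rather than one rule per URL. As a minimal sketch (assuming mod_rewrite is enabled and an SSL certificate is already installed on the server), a catch-all rule might look like:

```apache
# Force HTTPS with a permanent (301) redirect for every URL on the site
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

To migrate in sections, as suggested above, the `RewriteRule` pattern can be narrowed to a specific directory (for example `^blog/(.*)$`) so that only that part of the site redirects while you measure the impact.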

SSL certificates can also be costly, so migrating to HTTPS may be cost-prohibitive for smaller businesses. Google has, however, indicated that the rankings boost is minor—for now, at least—and that it affects only a very small share of searches. According to Google:

For now it’s only a very lightweight signal — affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content — while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.

For SEOs, HTTPS should, for now, play a limited role in your overall strategy, and for webmasters, there is time to make the switch. In fact, in the weeks since the announcement, it has been debated whether this new ranking signal has even taken effect. SearchMetrics reports no discernible change in rankings since Google made the announcement—no data to support that HTTPS is even a factor yet. However, this could be because the ranking signal is so light, or because such a low percentage of global search queries is affected.

Whether the rollout has simply not taken effect, or its impact is still too minor to be measured, the fact of the matter is that the announcement is only a few weeks old, and it will take time before we can really measure the data. The important thing is that there is time: time to weigh the pros and cons of HTTPS, decide how to go about the transition, and actually implement the migration.

12 Completely Appropriate Responses to Google in GIFS

Google is constantly changing; this summer alone, there have been several major updates. It’s all in a day’s work for SEOs, and Google does a pretty good job keeping us on our toes. Don’t get whiplash trying to keep up with all the algorithm updates: stay cool, roll with the punches, and when in doubt, react accordingly:

The morning when you notice a change in traffic that indicates an unnamed update and you think you’ve figured Google out.

When diving into your site’s traffic to investigate said unnamed update, and you see a significant jump in traffic…

…or a significant drop.

Trying to evaluate your keyword performance, but everything is (not provided).

When Google rolls out another Panda update…

…or a Penguin update.

The day Google announced Hummingbird.

When Google limited the number of characters displayed in title tags…

…and when they dropped Authorship photos.

Last week, when Google announced the HTTPS/SSL update, using site encryption as a ranking signal.

What Google’s really saying when they tell us our sites won’t be affected if we’re using white-hat SEO tactics.


5 Tips on Optimizing for Semantic Search

The Hummingbird update to the Google algorithm in the fall of last year was, and continues to be, a game changer for SEO. Unlike the Penguin and Panda updates that came before—both partial updates to the existing algorithm—Hummingbird was a completely new algorithm that changed much of the way search functions. Known for its focus on providing intuitive search results based primarily on user intent, Hummingbird understands the relationship of keywords and phrases to one another and uses this to rank websites relevant to what the user has searched.

Semantic Search

Semantic, or conversational, search existed before Hummingbird, although the new algorithm is the first to integrate it so completely into its ranking system. Semantic search allows users to search entire sentences or phrases and receive results based on the collective meaning of the keywords in those phrases. While there are many other features of Hummingbird that SEOs should keep in mind, semantic search is important as it changes the way we think about keywords.

Almost one year after the unveiling of Hummingbird, SEOs are still discussing how to optimize for semantic search. Here are five tips to keep in mind:

1.)  Revise Your Keyword Research Strategies

“‘Semantics’ refers to the meaning or interpretation of a word or phrase”, according to Search Engine Journal. With that in mind, keyword research is the best place to begin. How will the keywords on a page of your site be interpreted when Google crawls it? By taking a holistic approach and breaking your keywords into three tiers, you can present the search engines with a well-rounded list that accounts for variations in user intent.

  • Level 1 – Core Keywords: This list comprises keywords closely related to your initial target keywords. They should be variations of your targets close enough in meaning that Google can consider your site if any one of the core keywords is searched.
  • Level 2 – Thematic Keywords: Whereas Level 1 keywords are somewhat synonymous with one another, thematic keywords are further removed from your initial targets, yet conceptually related. If your target keyword for a page is “manhattan realtors”, a list of thematic terms like “new york city apartments” can help you potentially rank for the query “low rent new york city apartments”.
  • Level 3 – Stem Keywords: Your third level should include keywords that answer users’ questions. These keywords anticipate the information users need after they have found your page, and should be integrated into the content to answer those questions naturally. Once a user has found your page by searching “low rent new york city apartments”, they are likely seeking information on “finding low rent new york city apartments” or “renting affordable new york city apartments”. Your Level 3 keywords may be some variation of “rent new york city apartments” or “new york city apartment listings”.

Ideally, keywords from all three levels should be incorporated into your content.
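One way to keep the three tiers organized during research is a simple tiered map that can be flattened into a single list for a content outline. A minimal sketch in Python (the tier names and keywords are illustrative, not a prescribed taxonomy):

```python
# Hypothetical three-tier keyword map for a page targeting "manhattan realtors".
keyword_tiers = {
    "core": ["manhattan realtors", "manhattan real estate agents"],
    "thematic": ["new york city apartments", "nyc rentals"],
    "stem": ["rent new york city apartments", "new york city apartment listings"],
}

def build_keyword_list(tiers):
    """Flatten all tiers into one de-duplicated list for a content outline."""
    seen = []
    for tier_keywords in tiers.values():
        for keyword in tier_keywords:
            if keyword not in seen:
                seen.append(keyword)
    return seen

print(build_keyword_list(keyword_tiers))
```

Keeping the tiers separate until the outline stage makes it easy to check that each page draws from all three levels rather than stacking synonyms from Level 1 alone.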

2.)  Create a Robust Content Outline

This isn’t new with Hummingbird—it’s just good SEO. But with Hummingbird focusing on user intent, the pages that rank are the pages that, in addition to being optimized, provide relevant content.

Keyword stuffing has never been best practice. When optimizing for semantic search, it is important that your keywords are well placed, organically integrated into your content, and not overused. Create content around the questions and searches your keywords seek to answer. Outline which keywords are targeted for which pages, and how the content on each page will meet the user’s needs. From there, build out your content to be robust and to directly relate to the keywords you’ve delegated to that page.

3.)  Integrate Social Media

Social media is a significant component of Hummingbird’s personalization of search results. Google can now provide more refined, intuitive results pulling from personal data in users’ social media profiles. Search Engine Land had a great article on the topic that stated:

This suggests that paying attention to social search is becoming more and more critical, and that social media is playing a larger role in search results, sending strong signals to the search engines. Leveraging (and discovering) your target audience’s interest graph is key to producing content that will bring them to your website.

Essentially, fully integrating social media into your marketing strategy—creating content that reflects your audience’s social interests and sharing that content on your networks—is a large part of optimizing for semantic search.

4.)  Structure Your Data

Semantic search depends on structured data. Information on your website needs to be properly tagged, marked up, and organized in order for search engines to make sense of your pages. More than ever before, SEOs must be well versed in the back-end technical details that search engines recognize as indicators of a page’s relevancy. Working closely with webmasters, or familiarizing yourself with HTML markup, will ensure that your site is easily crawlable. Schema.org provides a collection of templates and markup vocabularies that Google, Bing, Yahoo!, and Yandex rely on.
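As a sketch of what Schema.org markup looks like in practice, here is a minimal microdata example for a local business page. The business name, address, and phone number are placeholders:

```html
<!-- Hypothetical Schema.org LocalBusiness markup using microdata -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <h1 itemprop="name">Example Realty Group</h1>
  <p itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example Ave</span>,
    <span itemprop="addressLocality">New York</span>
  </p>
  <span itemprop="telephone">(555) 555-0100</span>
</div>
```

The `itemscope` and `itemtype` attributes declare what kind of entity the block describes, and each `itemprop` labels a specific property, which is what allows a crawler to understand the page’s content rather than just read its text.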

5.)  Continue What You’re Doing

Since the launch of Hummingbird, Google has maintained that not much should change day-to-day for SEOs. While there are some specific things SEOs should keep in mind when optimizing for semantic search, that statement is, for the most part, true. Hummingbird rewards SEO best practices, and actually penalizes sites that use black hat tactics. A poorly optimized page will simply not rank. But pages that are properly marked up and focus on user intent will benefit.

Lastly, cross-pollination is important: collaborate with social media and other disciplines in search marketing, like SEM. Search engines are becoming more intelligent and responsive to users, and SEOs also need to think about how to anticipate what users ultimately need out of their search.

Google Authorship: Humanizing Your Content to Increase Rankings

Though Google+ has grown rapidly since its launch in 2011—reaching 540 million active users this past October—Facebook remains the world’s largest social network (Google+ is second). Nonetheless, the relevance of Google+ as an engaging social platform, and its importance to web marketers as a Google-owned property, cannot be denied. Described by Google itself as a “social layer”, the tool is meant to act as an overlay that allows users to interact socially with Google-enhanced websites and services.

One way that Google is doing this is through Google Authorship, which Forbes is calling one of the Top 7 SEO Trends Dominating 2014. Google Authorship is an important factor in the Google algorithm that aims to surface quality content in search rankings based, in part, on who authored it. Content alone is no longer enough to help a website rank for search terms. Google’s algorithm is continually evolving to highlight websites with content that is authoritative, engaging, diverse, and, just as importantly, written by an expert in the field.

With more and more businesses implementing Google Authorship in their marketing strategy, it is important to know the basics so you can determine if it is relevant to your business. Chances are, it is.

What is Google Authorship?

Google+ and Google Authorship go hand-in-hand. The wide integration of Google+ profiles into Google’s other properties, including its search engine (and, consequently, search results), has essentially humanized, or socialized, the way we search. According to The Huffington Post, Google Authorship was created “with the goal of allowing writers to claim their content, as well as allowing search engine users to find more content written by the same writer.” This allows users to see the face behind the content they’re reading, and adds to the trustworthiness of the website. If a user can see that one person is writing on the same topic across different sites, it establishes that person’s reputation as an expert in the space.

Authorship does not only allow users to find content created by a reputable writer; it also allows Google to do the same. The concept was implemented to reduce spam and improve the quality of content and, by extension, of search results. As the algorithm evolved to incorporate social media more actively into search results, Google+ became an important tool for sites to improve their rankings. That is exactly how Authorship works: by creating a solid Google+ profile, writers can link it to the content they create, allowing Google, and the user, to infer authority from their social presence.
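In practice, that link between a page and a writer’s Google+ profile is established with a small piece of HTML markup. A minimal sketch, where the profile ID and author name are placeholders:

```html
<!-- Option 1: a link in the page head pointing to the author's Google+ profile -->
<link rel="author" href="https://plus.google.com/1234567890" />

<!-- Option 2: an inline byline link carrying the rel=author parameter -->
<a href="https://plus.google.com/1234567890?rel=author">by Jane Doe</a>
```

For the connection to be verified, the Google+ profile must also link back to the site, by listing it in the profile’s “Contributor to” section.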

With Google+ allowing authors to “claim their content”, how do businesses ensure that their content is written by an authority figure? While not every business is in the position to source expensive content externally from prominent figures in their space, Forbes suggests “designating a person within your organization to take ownership of contributing great content to leading online publications…to become your true thought leader.”

What Author Rank Means for Your Page Rank

Google Authorship implies that Author Rank will eventually become as relevant as, if not more relevant than, page rank. As an author’s web presence grows more prominent through the sheer amount of content they have authored, and through the number of sites to which they contribute (linked on their Google+ profile), they become an authority, and ultimately their rank will factor into search results. It has even been speculated that websites that have not implemented Google Authorship will eventually be phased out of search results entirely (though that is just speculation at this point).

An excellent infographic by Internet Marketing Inc. (included below) highlights the importance of Author Rank. Author Rank is directly impacted by the writer’s social activity on Google+ and other social media platforms: frequency of posts, number of connections, the number of shares and +1’s their posts receive, and comments all factor in. And since Author Rank is tied directly to the writer’s content, not the site on which it is posted, Google’s continued emphasis on Authorship means that the value lies with the writer, not necessarily with the site. Your page rank will always matter, but you will also benefit directly from the writer’s Author Rank.

Because the content matters as much as the page on which it is hosted, Internet Marketing Inc. recommends guest posting, in addition to active social engagement, as a way to improve Author Rank. Linking a writer’s Google+ profile to their content means that a snippet of that profile, including their picture, appears in search results. Case studies have reported that Authorship can increase click-through rates by 30% to 150%, all by humanizing your content. A friendly face can do wonders for the trustworthiness of your site, and the frequency of clicks.

Check out The Huffington Post’s Complete Guide to Google Authorship for a thorough breakdown of how to effectively set up a Google+ profile, and scroll down for the full infographic from Internet Marketing Inc.

Facts Behind Google Author Rank & Authorship [Infographic]