Funny name. As it turns out (at least according to Kathleen McDivitt, MBA, author of the ebook The New Google SEO), Google recently revised its algorithm with the assistance of a man known simply as Panda. Maybe once you reach a certain level of notoriety in the Googlesphere, you can be referenced by a single name, à la Cher, Prince, Madonna, or Beyoncé of the music industry. Meat Loaf, does Meat Loaf count? I'll Google it later and let you know. Anyway, this Panda guy says to Google:
Hey, you guys should really revise your algorithm so that it more closely mimics the actions of a user, and I've created a way that we can evaluate how users behave.
So, Google conducted a study in which it asked end users to rate sites on three qualities: the quality of the content, the cleanliness of the design, and the trustworthiness of the site.
Through assessing users' feedback signals, Google recognized that the most emphasis was placed on trustworthiness; Panda (the person) then created a computer program, known as Google Panda, that would rate sites on those same three qualities. The explanation the book provided was that Google Panda is a:
...filter designed to weed out low quality pages. If a site has too many low quality pages, Google now flags the entire site as being untrustworthy.
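Put another way, the rule operates at the site level: enough weak pages drag down everything else. Google's actual scoring signals aren't public, but as a toy sketch of the thresholding idea in that quote (with made-up scores, cutoffs, and function names), it might look something like this in Python:

```python
# Toy illustration of the site-level rule described above: if too many
# pages on a site score as "low quality", the whole site gets flagged.
# The page scores and thresholds are hypothetical placeholders; Google's
# real signals and cutoffs are not public.

LOW_QUALITY_CUTOFF = 0.4   # pages scoring below this count as low quality
SITE_FLAG_RATIO = 0.3      # flag the site if 30%+ of its pages are low quality

def site_is_flagged(page_scores):
    """Return True if the share of low-quality pages crosses the threshold."""
    low_quality = sum(1 for score in page_scores if score < LOW_QUALITY_CUTOFF)
    return low_quality / len(page_scores) >= SITE_FLAG_RATIO

# A handful of strong pages can't save a site padded with thin ones:
print(site_is_flagged([0.9, 0.85, 0.2, 0.1, 0.3]))  # True: 3 of 5 pages are thin
print(site_is_flagged([0.9, 0.85, 0.7, 0.6, 0.2]))  # False: only 1 of 5
```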
Google, like any other for-profit institution, looks to improve its search results so that they not only deliver the fastest answer, ensuring that users keep returning, but also offer an experience that is relevant and trustworthy. Google also wants to highlight sites that offer the highest levels of engagement, which in turn provide value. Once Google Panda was up, running, and flagging sites, it became clear to web developers and SEO professionals that the old best practices were no longer merely outdated; they were now potentially harmful to an organic search ranking. A number of large, highly ranked sites reported to Google that their once-high rankings in search results had either decreased drastically or disappeared altogether. Though there are no absolutes in the world of SEO for Google, practitioners have learned that a few major changes to best practices now affect rankings.
The classic SEO practice held that sites featuring keywords that were generally aligned with what an organization did, and that were most commonly searched, would drive more traffic. While content managers continue to preach the value and importance of the most popular or most general blanket keywords, efforts should now be refocused on content-specific keywords. If you think about it, this makes sense. For example, Briteskies is an IT consulting firm that focuses on eCommerce web design, development, and integration between front-end interfaces and ERPs. I could create content built predominantly around commonly searched terms, such as web design or companies that make websites. While these would undoubtedly drive a lot of traffic, users who landed on our site might discover that our content doesn't quite match what they wanted; perhaps they only really needed a graphic designer, or a hosting company such as MageMojo, which "makes" websites appear online. If Briteskies.com continued to drive traffic to content that users deemed irrelevant, Google would treat these visits as violations of Panda's algorithmic rules, which state that a site must be of high quality, feature a clean design, and be trustworthy. If Briteskies racked up enough of these negative points, Google would either shove our search results deep into the belly of its page-three limbo, or beyond, or prevent briteskies.com from appearing in results at all.
Essentially, if you want to ensure that Google doesn't think you're trying to fool anyone with deceptive or inaccurate keywords, use ones that are much more specific to your products and/or services and the actual content of your pages. So, my keyword web design could change to eCommerce website development; it better describes what Briteskies does, driving traffic only from users who are really looking for that service while simultaneously telling Google Panda that we're trustworthy. This practice will also decrease a website's bounce rate: the rate at which users land on a page and immediately return to Google's search results. A high bounce rate indicates users are not finding what they were searching for on a particular site, which, again, tells Panda something's amiss.
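Because bounce rate is just the ratio of single-page sessions to total sessions, it's easy to compute from your own analytics data. Here's a minimal sketch; the session records below are hypothetical, and in practice this number comes straight from a tool like Google Analytics:

```python
# Minimal sketch: bounce rate = single-page sessions / total sessions.
# The session records are invented for illustration only.

sessions = [
    {"landing_page": "/ecommerce-development", "pages_viewed": 4},
    {"landing_page": "/ecommerce-development", "pages_viewed": 1},  # a bounce
    {"landing_page": "/web-design", "pages_viewed": 1},             # a bounce
    {"landing_page": "/web-design", "pages_viewed": 1},             # a bounce
]

def bounce_rate(session_records):
    """Share of sessions where the visitor left after a single page."""
    bounces = sum(1 for s in session_records if s["pages_viewed"] == 1)
    return bounces / len(session_records)

print(f"Bounce rate: {bounce_rate(sessions):.0%}")  # Bounce rate: 75%
```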
Fortunately, Google's Panda filter runs about every four to seven weeks. If an organization finds that its website has dropped in rank or vanished completely, there are a few things that can be done to alleviate the situation. First, a company will want to conduct a thorough examination of its site using the aforementioned guidelines. It may be that a few updates and a bit of cleanup could do the trick, or feed the bear; Panda does its thing and you're flying high again. Or, after examining a site and finding nothing adverse according to the latest of Google's standards, an organization may appeal to the Great and Powerful Google itself. As with all technologies, machines, and good intentions, Panda is not, nor will it ever be, the be-all and end-all of Google's website filters; it simply happens to be in vogue at the moment.
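That thorough examination can be partially automated. As one hedged example, a quick first pass is to scan your own pages for thin content, a common symptom of low quality; the word-count cutoff and URLs below are placeholders, and a real audit would look at far more than length:

```python
# A rough first-pass audit: fetch each page and flag any with very little
# visible text, since thin content is one common symptom of "low quality".
# Requires the requests and beautifulsoup4 packages; the URL list and
# 300-word cutoff are illustrative, not an official Panda threshold.

import requests
from bs4 import BeautifulSoup

MIN_WORDS = 300  # arbitrary cutoff for this sketch

def audit_pages(urls):
    """Print pages whose visible text falls under the word-count cutoff."""
    for url in urls:
        html = requests.get(url, timeout=10).text
        text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
        word_count = len(text.split())
        if word_count < MIN_WORDS:
            print(f"THIN ({word_count} words): {url}")

audit_pages([
    "https://www.example.com/",          # placeholder URLs; substitute
    "https://www.example.com/services",  # your own sitemap here
])
```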
However permanent or fleeting this way of viewing websites may prove to be, Google makes the rules. Don't believe me? Simply search it on the engine of your choice and tell me what you find.
Next: Google Panda: My Beef with The Bear