Articles on Google Matters:
Personalized Search - All's Well Or Orwell?
You go to Google and enter your search term. Big Brother, the totalitarian character from George Orwell's novel 1984, watches with detached interest. You see, to Big Brother, you are only a number - but he'd like to know as much about you as he can. Knowing you allows Big Brother to do many things - both good and evil.
Alright, enough of the "Big Brother" comparison - it's been done many times before (and done many times better). However, there is an important central point to be made about personalized search. Google is now (and has been for some time) collecting data on individual users, and they are assuming that users will trust them with this data to "Do No Evil," as their famous slogan goes. Only time will tell whether the trust is well-placed, or if people are willing to trust search engines with this type of data at all.
The basic principle behind personalized search is simple. When you go to Google and type in a search query, Google stores the data. As you return to the engine, a profile of your search habits is built up over time. With this information, Google can understand more about your interests and serve up more relevant search results.
For instance, let's say that you have shown an interest in sport fishing in your search queries, while your neighbor has shown an interest in musical instruments in his. Over time, as these preferences become clear to the engine, your personalized results for the term "bass" will consist largely of pages about the fish, while your neighbor's results for "bass" will consist primarily of pages about the musical instrument.
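The re-ranking idea can be sketched in a few lines of Python. This is purely illustrative - a toy model of my own, not Google's actual system - and the topic keywords and result data are invented:

```python
# Toy illustration of personalized re-ranking (not Google's actual
# algorithm): each user's past queries build a topic profile, and
# ambiguous results are reordered to favor the dominant topic.
from collections import Counter

def build_profile(past_queries, topic_keywords):
    """Count how often each topic's keywords appear in a user's history."""
    profile = Counter()
    for query in past_queries:
        for topic, words in topic_keywords.items():
            if any(w in query.lower() for w in words):
                profile[topic] += 1
    return profile

def personalize(results, profile):
    """Sort results so those matching the user's strongest topics come first."""
    return sorted(results, key=lambda r: profile[r["topic"]], reverse=True)

# Hypothetical topic lists and search histories:
topic_keywords = {
    "fishing": ["fishing", "lure", "rod", "lake"],
    "music": ["guitar", "amplifier", "band", "chord"],
}
angler = build_profile(["best fishing lures", "lake trout season"], topic_keywords)
results_for_bass = [
    {"title": "Bass guitar reviews", "topic": "music"},
    {"title": "Largemouth bass fishing tips", "topic": "fishing"},
]
print(personalize(results_for_bass, angler)[0]["title"])
# The angler sees the fishing result first.
```

The real system would weigh far more signals, but the shape of the idea - history in, biased ordering out - is the same.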
At present, you need to have signed up for a Google service for your results to be personalized. Such services include Gmail, AdWords, Google Toolbar, and many others. By default, as long as you are signed in to one of these programs, your personal search data will be collected. The term "at present" is used because Google certainly could implement personalized search for any user of the engine, regardless of whether he or she has a Google account. Google already places a cookie, or unique identifier, on the machine of anyone who types in a search query on Google - it would not be hard for the company to use that cookie, rather than the Google account, to collect individual user data and personalize results. It is quite possible that Google is testing the waters of personalized search with people who have opted in to one of its services, and will expand the system to all users if there is limited uproar or government intervention.
For search engine optimization firms, the major shift brought about by personalized search will be in how they report Google ranking data to clients. When collecting this data, they will have to run their queries from a "clean" machine - that is, one with no Google programs or cookies on it. The baseline results reported to the client will essentially be a snapshot of what a search engine user with no Google software installed would see. The good news is that Google account holders who have shown an interest in certain products and services will likely see results more favorable to the client than the baseline indicates, since personalized search ensures that their search histories will be reviewed and the results skewed toward the client's industry. The bad news is that the search engine optimization firm will be hard-pressed to demonstrate this. Worse, the results that a client using a Google program sees on its own machines will almost certainly not match the results the firm is reporting (although the client's machines should show better results, for the same reasons cited above).
Some people find the practice of storing information for personalized search purposes disturbing; others find the end result to be useful (still others find themselves experiencing an odd combination of both reactions). In defense of the engines, it is not as if they are building a dossier on individuals - again, you are only a number to them. However, the potential for misuse of the data is fairly high.
There are many advertising firms out there already that go through the cookies on your machine to figure out which ads will have the best effect on you. If you've ever been on a website and seen a banner ad that is directly related to something you have been doing research on lately, it is most likely not a coincidence. The ad platform simply browsed through the cookies on your machine to find out what topic held your interest, and dropped in a related ad once it determined what that topic was. Search engines have been buying firms with this technology lately; notable recent purchases include that of DoubleClick by Google and aQuantive by Microsoft. There seems to be little doubt that your search history will be combined with existing ad-serving technology to deliver even more relevant ads. Whether this constitutes misuse seems to be debatable - some people seem to have no problem with it, while it makes many others fairly uneasy.
Privacy issues arising from personalized search are also a big question. The EU recently announced that it is probing how long Google stores user information (a probe subsequently extended to all search engines). AOL recently committed a serious blunder when it released search data from 500,000 of its users, and it was discovered that many people could be identified fairly easily by the search terms they used (ever "ego surf" - that is, type your own name into a search engine to see what comes up? If so, you wouldn't be hard to spot). In addition, since the IP address of the computer submitting the query is also reportedly tracked, a court order forcing the engine and the ISP (Internet Service Provider) to hand over specific search data on individuals is a distinct possibility - the technology required to deliver on such a demand is already in use.
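To see why "anonymized" query logs are so leaky, consider this toy sketch. The user IDs, queries, and name below are invented for illustration: users are keyed only by a number, yet a single ego-surf query ties that number to a name, and with it the rest of that user's history.

```python
# A toy illustration of why "anonymized" query logs are risky: users are
# keyed only by a number, yet an ego-surf query can tie that number
# straight to a name. All IDs, queries, and names here are invented.
from collections import defaultdict

log = [
    (24601, "numb fingers"),
    (24601, "jane doe"),                     # an ego-surf query
    (24601, "landscapers in springfield"),
    (99999, "cheap flights"),
]

# Group the log by the "anonymous" user number.
queries_by_user = defaultdict(list)
for user_id, query in log:
    queries_by_user[user_id].append(query)

known_names = {"jane doe"}   # e.g. names pulled from a phone book
matches = {}
for user_id, queries in queries_by_user.items():
    hits = known_names.intersection(queries)
    if hits:
        matches[user_id] = hits

print(matches)  # {24601: {'jane doe'}}
```

One matched query exposes not just the name but everything else filed under that number - exactly the pattern journalists exploited with the AOL data.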
Unless the government intervenes, the question will probably be decided by personal preference. As it becomes more common knowledge that Google (and other engines) store this type of data to enable personalized search, many users will take measures to block its use.
Are the search engines that collect this data "Doing No Evil"? The answer, I believe, will depend on each individual's definition of evil. In the meantime, don't be surprised when you type in a search query and the engine seems to be reading your mind. It isn't, really - it's merely parsing through your memories.
Scott Buresh
Scott Buresh is the founder of Medium Blue, a search engine optimization company. His articles have appeared in numerous publications, including MarketingProfs, ZDNet, SiteProNews, DarwinMag, ISEDB.com, and Search Engine Guide. He was also a contributor to Building Your Business with Google For Dummies (Wiley, 2004). Medium Blue has local and national clients, including Boston Scientific, Cirronet, and DS Waters, and was recently named the number one search engine optimization company in the world by PromotionWorld. Visit www.mediumblue.com to request a custom SEO guarantee based on your goals and your data.
Copyright © 2006 - 2007 Tons Of Matters.com. All rights reserved.
Google Algorithm Update Analysis
Anybody who monitors their rankings with the same vigor that we in the SEO community do will have noticed some fairly dramatic shifts in the algorithm starting last Thursday (July 5th) and continuing through the weekend. Many sites are rocketing into the top 10, which, of course, means that many sites are being dropped at the same time. We were fortunate not to have any clients on the losing end of that equation; however, we have called and emailed the clients who saw sudden jumps into the top positions to warn them that further adjustments are coming. After a weekend of analysis, there are some curiosities in the results that simply demand further tweaks to the ranking system.
This update seems to have revolved around three main areas: domain age, backlinks and PageRank.
Domain Age
It appears that Google is presently giving a lot of weight to the age of a domain and, in this SEO's opinion, disproportionately so. While the age of a domain can definitely be used as a factor in determining how solid a company or site is, there are many newer sites that provide great information and innovative ideas. Unfortunately, a lot of these sites got spanked in the last update.
On this tangent, I have to say that Google's use of domain age is, on the whole, a good filter, allowing them to "sandbox" sites on day one to ensure that they aren't just being launched to rank quickly for terms. Thinking back to the "wild west days" of SEO, when ranking a site was a matter of cramming keywords into content and using questionable methods to generate links quickly, I can honestly say that adding this delay was an excellent step that sharply limited the benefits of pumping out domains. So I approve of domain age being used to value a site - to a point.
After a period of time (let's call it a year, shall we?), age should have - and generally has had - only a very small influence on a site's ranking, with the myriad other factors overshadowing the site's whois data. This appears to have changed in the recent update, with age carrying a disproportionate weight. In a number of instances, this has resulted in older, less qualified domains ranking higher than newer sites of higher quality.
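The capped-age idea can be illustrated with a hypothetical scoring function. This is my own invention - Google's actual formula is not public - but it shows the behavior being argued for: age adds a bonus that grows for roughly a year and then flattens, so old domains can't coast on age alone.

```python
# Hypothetical ranking sketch (invented for illustration, not Google's
# formula): domain age contributes a bonus that grows linearly for the
# first year, then flattens at a cap.
def age_factor(age_in_days, cap_days=365, max_boost=0.1):
    """Return an age bonus that grows linearly, then flattens at the cap."""
    return max_boost * min(age_in_days, cap_days) / cap_days

def score(relevance, age_in_days):
    """Combine content relevance with the (capped) age bonus."""
    return relevance + age_factor(age_in_days)

# A highly relevant 2004-era site should beat a weaker 2000-era site,
# because age stops mattering past the cap:
new_site = score(relevance=0.80, age_in_days=3 * 365)
old_site = score(relevance=0.60, age_in_days=7 * 365)
print(new_site > old_site)  # True
```

Under this shape, two sites past the cap compete purely on their other signals - which is exactly the "hold steady after a year or so" behavior described above.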
This change in the ranking algorithm will almost certainly be adjusted as Google works to maximize the searcher's experience. We'll get into the "when" question below.
Backlinks
The way that backlinks are calculated and valued has seen some adjustments in the latest update as well. The change takes me back a couple of years to the more easily gamed Google of old. That statement alone reinforces the fact that adjustments are necessary.
The way backlinks are being valued appears to have lost some grasp on relevancy and placed more importance on sheer numbers. Sites with large, unfocused reciprocal link directories are outranking sites with fewer but more relevant links. Non-reciprocal links have lost the "advantage" that, until recently, they held over reciprocal links.
Essentially, the current environment is such that Google has made itself more easily gamed than it was a week ago. Right now, building a reasonably sized site with a large reciprocal link directory (even an unfocused one) should be enough to get you ranking. For obvious reasons, this cannot (and should not) stand indefinitely.
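The difference between counting links and weighting them by relevance is easy to sketch. The relevance scores below are invented for illustration; the point is only how the two scoring methods disagree:

```python
# Sketch of counting links vs. weighting them by relevance (the numbers
# here are invented for illustration, not measured data).
links_to_site_a = [0.9, 0.8, 0.9]     # few, highly relevant links
links_to_site_b = [0.05] * 40         # many unfocused reciprocal links

def by_count(links):
    """Rank purely on how many backlinks a site has."""
    return len(links)

def by_relevance(links):
    """Rank on the summed relevance of each backlink."""
    return sum(links)

# Under a pure count, the unfocused directory wins; weighted by
# relevance, the focused site wins.
print(by_count(links_to_site_b) > by_count(links_to_site_a))          # True
print(by_relevance(links_to_site_a) > by_relevance(links_to_site_b))  # True
```

The complaint in this update, in these terms, is that Google temporarily slid from the second function toward the first.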
PageRank
On the positive side of the equation, PageRank appears to have lost some of its importance, including the importance of PageRank as it pertains to the value of a backlink. In my opinion, this is a very positive step on Google's part and shows a solid understanding of the fact that PageRank means little in terms of a site's importance. That said, while PageRank is a less-than-perfect calculation, subject to much abuse and manipulation by those pesky people in the SEO community, it did serve a purpose. It needed to be replaced, but it doesn't appear to have been replaced with anything of substantial value.
A fairly common belief has been that PageRank would be (or is being) replaced by TrustRank, and that Google would not give us a green bar to gauge a site's trust (good call, Google). With this in mind, one of two things has happened: either Google has decided that both TrustRank and PageRank are irrelevant and scrapped them (unlikely), or they have shifted some of the weight from PageRank to TrustRank and are just now sorting out the issues with their TrustRank calculations (more likely). Any problems with TrustRank may not have been apparent while it carried little weight in the overall algorithm; with this shift reducing the importance of PageRank, the issues facing the TrustRank calculations may well be becoming more evident.
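Unlike TrustRank's internals, the original PageRank calculation is public: a page's rank is the stationary distribution of a "random surfer" who follows links with probability d and jumps to a random page otherwise. A minimal power-iteration sketch over a toy link graph (the graph itself is made up):

```python
# Minimal PageRank power iteration over a toy link graph. The algorithm
# shape is the published one; the graph data is invented for illustration.
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iterations):
        new_rank = {p: (1 - d) / n for p in pages}   # random-jump share
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += d * rank[page] / n
            else:
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # "c" collects the most rank here
```

Whatever trust-based signal replaces or supplements this, we have no comparable public description of it - which is part of why the current shakeout is so hard to read from the outside.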
In truth, the question is neither here nor there (as important a question as it may be). We will cover why in the conclusion.
Conclusion
So what does all of this mean? First, it means that this Thursday or Friday we can expect yet another update to correct some of the issues that have arisen out of the most recent round. This shouldn't surprise anyone too much; we've been seeing regular updates from Google over the past few months.
But what does this mean for the aging of domains? While I truly feel that an aging delay or "sandbox" is a solid filter on Google's part, it needs to have a maximum duration. A site from 2000 is not, by default, more relevant than a site from 2004. After a year or so, the trust of a domain should hold steady or, at most, carry a very slight weight. This is an area where we are very likely to see changes in the next update.
As far as backlinks go, we'll see changes in the way they are calculated unless Google is looking to revert to the problems it had in 2003. Lower-PageRank, highly relevant links will once again surpass high-quantity, less relevant links. Google is getting extremely good at determining relevancy, so I assume the current algorithm issues have more to do with the weight assigned to different factors than with an inability to properly calculate a link's relevancy.
And as for PageRank, Google will likely shift back slightly to what worked and give it more importance, at least while they figure out what went awry here.
In short, I would expect that with an update late this week or over the weekend, we're going to see a shift back to last week's results (or something very close to them), after which Google will work on the issues they've experienced and launch a new (hopefully improved) algorithm shift the following weekend. So if you've enjoyed a sudden jump from page 6 to the top 3, don't pop the cork on the champagne too quickly; and if you've noticed some drops, don't panic. More adjustments to this algorithm are necessary, and if you've used solid SEO practices and been consistent and varied in your link-building tactics, keep at it and your rankings will return.
Dave Davies
Dave Davies is the CEO of Beanstalk Search Engine Positioning, Inc. Beanstalk offers search engine optimization services to businesses small and large as well as providing consulting, training, copywriting and link popularity programs. To keep your site optimized and monitor its results you'll want to bookmark their SEO blog and free SEO tools pages.