Location: Columbus, Ohio, United States

Friday, April 28, 2006

discussing the Google sandbox

In the old days, it seemed the monthly update cycle was due to the three weeks needed to run all the iterations required for PageRank to converge. PR is definitely calculated differently these days, but why would PR iterations slow down so much? I believe PR has been completely replaced as the basis of website ranking (although it probably still plays a part later on).
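For readers unfamiliar with why "resolving" PageRank takes iterations: it is computed by repeatedly spreading rank along links until the scores stop changing. The sketch below is a toy power iteration over a hypothetical three-page graph, purely illustrative of the mechanism, not Google's actual implementation or parameters.

```python
# Toy PageRank power iteration (illustrative only; the graph, damping
# factor, and tolerance are assumptions, not Google's real values).

def pagerank(links, damping=0.85, tol=1e-6, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    iterations = 0
    for iterations in range(1, max_iter + 1):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for p, outlinks in links.items():
            if outlinks:
                share = damping * rank[p] / len(outlinks)
                for q in outlinks:               # spread rank along each link
                    new_rank[q] += share
            else:
                for q in pages:                 # dangling page: spread evenly
                    new_rank[q] += damping * rank[p] / n
        if max(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            rank = new_rank
            break                               # converged: scores stable
        rank = new_rank
    return rank, iterations

# Tiny example web: A <-> B, B -> C, C -> A
ranks, iters = pagerank({"A": ["B"], "B": ["A", "C"], "C": ["A"]})
```

On a toy graph this converges in a few dozen iterations; on a web-scale graph each iteration is a pass over billions of links, which is where the weeks of computation used to go.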

My thoughts on the Sandbox revolve around Google no longer basing rankings on the page but on the search phrase. After the Florida update, we could compare two radically different sets of results using the -asdf string. The new results did not appear to be a filtered subset of the old PR-based results; they were an entirely different set. There was some overlap, of course, but not in the ordering.

Myspace
Myspace Backgrounds
Myspace Layouts
Xanga
Xanga Layouts

These search phrases were called 'money phrases' because they were primarily the phrases targeted by commercial web sites. The number of covered phrases expanded massively in August 2004 to include non-money phrases. In addition, a 'website filter' was applied: it would suddenly and massively reduce the rankings of all of a website's pages, across every search phrase Google recognised.


Following this line of thinking, you can see a possible explanation for how the Sandbox works. You create a website. At first, the web pages are 'folded in' to the results. At some later point, anywhere from a few days to a few weeks, they are given a more permanent ranking within the search phrase.


Maybe Google does a 'do we trust this website' calculation: some combination of TrustRank, the age of links, the nature of links, the power of links, similarity of the link pattern to spam networks, or whatever. After that, the traditional algorithm, including PageRank, is applied.
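One way to picture such a calculation is as a weighted blend of signals, with spam similarity counting against the site. The factor names, weights, and inputs below are entirely my own assumptions for illustration, not anything Google has published.

```python
# Hypothetical 'do we trust this website' score: a weighted combination
# of link-based signals. All names and weights are made up for illustration.

TRUST_WEIGHTS = {
    "trustrank": 0.4,        # seed-based trust propagated through links
    "link_age": 0.2,         # older inbound links count for more
    "link_power": 0.2,       # strength of the pages linking in
    "spam_similarity": 0.2,  # how closely the link pattern matches spam networks
}

def trust_score(signals):
    """signals: dict of factor name -> value in [0, 1]; returns a score in [0, 1]."""
    score = 0.0
    for factor, weight in TRUST_WEIGHTS.items():
        value = signals.get(factor, 0.0)
        if factor == "spam_similarity":
            value = 1.0 - value  # invert: resembling spam lowers trust
        score += weight * value
    return score

# A brand-new site with few aged links vs. an established one
new_site = trust_score({"trustrank": 0.1, "link_age": 0.0,
                        "link_power": 0.3, "spam_similarity": 0.6})
aged_site = trust_score({"trustrank": 0.7, "link_age": 0.9,
                         "link_power": 0.6, "spam_similarity": 0.1})
```

Under this kind of model, a new site scores low almost by definition, because link age and accumulated trust take time, which would produce exactly the Sandbox-like delay described above.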


The idea that 'trusting websites' comes down to a single pass over some set of factors is interesting. It was thought that per-search-phrase ranking (such as Latent Semantic Indexing) was beyond current computing power for a large number of search phrases. But some of the people who wrote papers on these subjects ended up working at Google.
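To see why per-phrase ranking is so much heavier than a single global score, consider even the simplest phrase-to-document similarity: every phrase needs its own scoring pass over candidate documents. The toy below uses raw term-overlap cosine similarity; real LSI goes further and factors the whole term-document matrix with SVD, which is where the computing-power concern came from. The documents and phrase here are invented examples.

```python
# Toy per-phrase scoring via term-overlap cosine similarity.
# A stripped-down stand-in for phrase-based ranking; real Latent Semantic
# Indexing decomposes a term-document matrix with SVD, which is far costlier.
import math
from collections import Counter

def cosine(counts_a, counts_b):
    """Cosine similarity between two term-frequency Counters."""
    dot = sum(counts_a[t] * counts_b[t] for t in counts_a if t in counts_b)
    norm_a = math.sqrt(sum(v * v for v in counts_a.values()))
    norm_b = math.sqrt(sum(v * v for v in counts_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def score(phrase, doc):
    return cosine(Counter(phrase.lower().split()), Counter(doc.lower().split()))

docs = ["free myspace layouts and backgrounds",
        "animated cursors for xanga pages"]
# Each phrase requires its own scoring pass over the documents:
best = max(docs, key=lambda d: score("myspace layouts", d))
```

Doing something like this (let alone full LSI) independently for every recognised search phrase is a very different cost profile from computing one PageRank vector for the whole web.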
