WANTED: 6.8 Billion Googles

It seems to be common sense: if you are searching for something, you should be the dominating filter. 'You' in this case might be: male, 38 years old, residing in Southern California, speaking American English, driving an SUV, college-educated … etc. The more you inject 'you' into a search, the fewer results you get, and consequently the less time you spend looking through results that do not matter to you. So, shouldn't there be 6.8 Billion Googles?
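The idea can be sketched in a few lines: take the generic results an engine would return for everyone, and re-rank them by how well each one matches the searcher's profile. Everything below (the attribute names, the `audience` tags, the scoring rule) is a hypothetical illustration, not how any real engine works.

```python
# Toy sketch of user-as-filter: re-rank generic results by overlap
# with a hypothetical user profile. All data here is illustrative.

def personalized_rank(results, profile):
    """Score each result by how many profile attributes it matches,
    then sort best match first; Python's sort is stable, so ties
    keep the engine's original (generic) order."""
    def score(result):
        return len(profile & result["audience"])
    return sorted(results, key=score, reverse=True)

# Generic results, each tagged with the audience it is aimed at.
results = [
    {"url": "example.com/suv-deals",   "audience": {"en-US", "suv", "socal"}},
    {"url": "example.com/uk-rail",     "audience": {"en-GB", "commuter"}},
    {"url": "example.com/ca-colleges", "audience": {"en-US", "college", "socal"}},
]

# The '38-year-old Southern Californian SUV driver' from above.
profile = {"en-US", "socal", "suv", "college"}

for r in personalized_rank(results, profile):
    print(r["url"])
```

With 6.8 billion distinct profiles, the same generic index would yield 6.8 billion different orderings, which is the whole point of the title.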

Instead, search engines typically use one filter (algorithm) for all users of their engine, consequently delivering the same results to every one of them. Index size does not matter, even if the latest PR campaign by the search engine Cuil suggests otherwise. More surprisingly, although Cuil apparently puts no weight on keyword placement (specifically in the domain and title tag), there are no signs of user-centered design. Instead, a search for the most obscure keywords will bring tens of thousands of 'relevant' SERPs (search engine results pages), most of which will never see the backlight of a monitor.

Do the thousands of super-smart search engineers employed by the leading search engines miss the obvious? Not likely! So why do the dominating search portals, Google, Yahoo, and MSN, seem to be holding on to patterns that are inherently flawed?

The answer – again – seems obvious: there is little to no competitive pressure on the existing search engines to correct course. If reducing the time it takes to find the (only) relevant result by 50% meant reducing profits by 50%, would YOU do it?