
It's not how big it is, it's how you use it

Filed under
Web

Last week saw the resumption of the search engine size wars, in which one major search engine claims to be larger than its rivals, prompting those rivals to rapidly upsize themselves. Yahoo fired the first round at Google, claiming to have over 20 billion objects accessible in its database. Google, which can only claim about 13 billion objects, fired back with questions about measurements, basically stating that Yahoo was mistaken or misleading in its claims.

Others got in on the act, and the blogosphere was full of stories about Yahoo's obsession with size. By the beginning of this week, the search marketing community was fed up with being fed tripe about the importance of size, as reflected in Danny Sullivan's August 16th post to Search Engine Watch, "Screw Size! I dare Google and Yahoo to Report on Relevancy."

The frustration that serious search marketers feel with the major search engines is real. Our clients don't care about size, and neither does their money. They care about being found when searchers are seeking information about the products or services they sell. They care about potential clients and about their ability to present information to them. They care about being relevant.

Search engine users don't really care about size either. Given the mind-boggling amount of data available via even the smallest of the major search engines, most users have no idea of the depth of search results; they tend to look only at the top 10 or 20 listings. Even if Yahoo returns thousands more references than Google for any given keyword query, both engines know that only the first 20 links tend to see any measurable traffic. Again, it isn't about being the biggest; it is about being the best, and being the biggest does not necessarily mean being the best.

There is no real scientific method of proving which search engine is the biggest, and no real way to gauge which one is best. That's not to say folks aren't trying, though. The thing to remember is that "best" means something slightly different to every search engine user.

Full Article.
