Search engines have become essential for doing business, since hardly any Internet user can do without them. Barely anyone enters a URL directly into the browser or forgoes search engines entirely. The majority start their search by typing in a keyword, as they often do not have the direct URL. Only sites such as Facebook, eBay or Amazon enjoy a reputation strong enough that they are mainly called up directly via their web address.

Consumers in the Asian region spend, for instance, an average of 17 hours a month online, with Google being the leading search site, as expected, followed by Yahoo! – though Tencent leads in visitor engagement. And although Google draws the largest share of search traffic in the Asia Pacific region, Baidu dominates the search landscape in China (and Naver in South Korea), according to a study by Digital Portal.

The heaviest searchers in Asia are, surprisingly, the Thais, with 143.3 searches per user a month, outdoing India and China, which nevertheless post an impressive 94.5 and 78.3 searches per user, respectively.

Mobile search, however, is particularly popular in Asia Pacific, Digital Portal’s study revealed, which makes optimizing for mobile search one of the highest priorities. In fact, search traffic from mobile devices such as smartphones and tablets is steadily on the rise and has already exceeded search from computers. The report states that in 2014 there were more than 895 million mobile internet users in the region, a number that will keep growing to reach 1.29 billion by 2016.

What are search engines looking for and how?

Looking at such facts and figures, it is even more interesting to find out how search engines work in the first place, don’t you think? Well, most search engines are index-based and browse the web using automated programs called web crawlers, robots or spiders. Thus, when a search request comes in, it is not the entire web that is scanned, but only the index.
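
To make that index idea concrete, here is a minimal sketch in Python of an inverted index: the documents are scanned once up front, and a query is then answered from the index alone. The sample documents and names are illustrative assumptions, not any real search engine’s internals.

```python
from collections import defaultdict

# Illustrative sample "web pages" (an assumption for this sketch).
documents = {
    1: "mobile search is growing in asia pacific",
    2: "search engines index the web with crawlers",
    3: "good content helps search engine ranking",
}

# Build the inverted index once, ahead of any query:
# each word maps to the set of documents containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.split():
        index[word].add(doc_id)

def search(query):
    """Answer a query from the index alone -- no document is re-scanned."""
    ids = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*ids) if ids else set()

print(search("search engines"))  # -> {2}
```

This is why search results come back in milliseconds: the expensive scanning happened at crawl time, and the query only touches the precomputed index.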

The main principle of any search engine is generally very similar, since search engines are in fact nothing other than software programs that constantly comb through the Internet for quality content that is suitable for the user. Some web pages are scanned and sorted daily or even hourly for new content, so that the databases of the search engines are always up to date.

Here, the search engine’s robot first examines a website while following its links (image and text links). The site is then scanned and read page by page, and the texts and HTML elements are stored in interconnected databases. These in turn are assessed and indexed according to specific criteria.
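
The following is a simplified sketch of that crawl-and-store loop, using only Python’s standard library; the start URL, depth limit and storage structure are assumptions for illustration, and a real crawler is of course far more elaborate (politeness rules, robots.txt, deduplication and so on).

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects the links and visible text of one page."""
    def __init__(self):
        super().__init__()
        self.links, self.text = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

def crawl(start_url, max_pages=5):
    """Follow links breadth-first and store each page's text;
    the stored texts would then be assessed and indexed."""
    queue, seen, store = [start_url], set(), {}
    while queue and len(store) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue  # unreachable or non-http link: skip it
        parser = PageParser()
        parser.feed(html)
        store[url] = " ".join(parser.text)                       # text -> database
        queue += [urljoin(url, link) for link in parser.links]   # follow links
    return store

pages = crawl("https://example.com")  # placeholder start URL
```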

What is scanned, which elements are important, and what criteria will be considered?

The robot or spider is primarily looking for content that it can store in the search engine’s database. HTML elements are indexed most fully, whereas Flash content is indexed only partly, and JavaScript and Java applets barely or not at all. That’s why search engines need a lot of text and good content. Websites with little text but a lot of graphics are hardly noticed by search engines. For search engine optimization (SEO), texts and meaningful keywords are therefore the alpha and omega of a top ranking. Websites with a high number of Likes, Shares and Tweets also have a better chance of a good ranking, which makes clear why creating content that gets shared often is becoming more important.
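
A small sketch can show what an indexer actually “sees”: plain HTML text is kept, while script bodies contribute nothing. The sample markup below is an assumption for illustration only.

```python
from html.parser import HTMLParser

class VisibleText(HTMLParser):
    """Keeps page text, skips script/style content -- a rough model
    of why text-light pages leave little for the index."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True   # not indexable text

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.words += data.split()

html = """<html><body>
<h1>Good content ranks</h1>
<script>var x = 'invisible to the index';</script>
<p>Meaningful text and keywords are what the crawler stores.</p>
</body></html>"""

parser = VisibleText()
parser.feed(html)
print(parser.words)  # only the heading and paragraph text survive
```

A page made mostly of images, Flash or scripts leaves that word list nearly empty, which is exactly why such sites are hardly noticed by search engines.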

How the various elements are assessed in detail is specified by each search engine’s operator. Google, for instance, uses more than 200 criteria to rank the pages on which search results are displayed, the so-called Search Engine Result Pages (SERPs). Some of the ranking factors used to rank web pages are kept secret by the search engine providers – for good reason – as there are shady companies and individuals that try to manipulate search results.

On the one hand, they use spamming, literally flooding the search engines with senseless content. On the other hand, they try cloaking, which means serving the search engines specially created pages that differ from what human visitors see. Another dubious practice is the use of doorway pages, where thousands of pages are generated that link to each other. Most spammers’ tricks, however, are already well known to search engines, allowing them to prevent the cheating or punish the deceiver – sometimes by deleting the website from the index completely. Therefore, beware of these practices and steer clear of companies that offer to optimize your website that way.
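
To see why cloaking is detectable at all, consider this crude sketch: the same URL is fetched once with a browser-like user agent and once with a crawler-like one, and the responses are compared. The user-agent strings and URL are assumptions, and real detection is far more involved, since legitimate pages can also vary between requests.

```python
from urllib.request import Request, urlopen

def fetch_as(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    return urlopen(req, timeout=5).read()

def looks_cloaked(url):
    as_browser = fetch_as(url, "Mozilla/5.0")     # what a visitor gets
    as_crawler = fetch_as(url, "Googlebot/2.1")   # what a spider gets
    # Differing bodies merely hint at cloaking; dynamic content
    # can differ for harmless reasons too.
    return as_browser != as_crawler

print(looks_cloaked("https://example.com"))  # placeholder URL
```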

By Daniela La Marca