Saturday, April 16, 2011

How Search Engines Work

First of all, let us understand what a search engine is. A search engine is basically a website that collects information from the Internet and displays it to the user as a results page in response to his query. A search engine provides a search box where the user enters keywords and presses "Search". The results are shown on the next page as links with a short description.

Search engines basically use an algorithmic software program called a crawler (also called a bot, robot, or spider); all of these terms refer to the same thing. A search engine does not see a website the way we do. The crawler visits the website, collects information, and processes it. To understand how a crawler works, consider an example: imagine the crawler carries a worksheet with fields on it. When it visits a website, it fills in the data under the appropriate fields. It is quite similar to people going door to door, collecting information about the residents of each house to estimate the size of the population. The worksheet the crawler carries has entries such as URL, Title, Meta, Content, and Links, and each of them can be expanded with details. After collecting the required data from the website, the crawler processes it and indexes (stores) it in the search engine's database.
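
As a rough illustration of that worksheet idea, here is a minimal Python sketch of a crawler filling in one page's fields. The field names, the parsing rules and the example URL are only assumptions for demonstration; a real crawler is far more sophisticated.

    # Minimal sketch: fill a "worksheet" (URL, Title, Meta, Content, Links) for one page.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class WorksheetParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.worksheet = {"URL": "", "Title": "", "Meta": "",
                              "Content": [], "Links": []}
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self._in_title = True
            elif tag == "meta" and attrs.get("name") == "description":
                self.worksheet["Meta"] = attrs.get("content", "")
            elif tag == "a" and "href" in attrs:
                self.worksheet["Links"].append(attrs["href"])

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            text = data.strip()
            if self._in_title:
                self.worksheet["Title"] += text
            elif text:
                self.worksheet["Content"].append(text)

    def crawl(url):
        # Download the page and let the parser fill in the worksheet fields.
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        parser = WorksheetParser()
        parser.worksheet["URL"] = url
        parser.feed(html)
        return parser.worksheet

    # Example (hypothetical URL): worksheet = crawl("http://example.com/")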

There are trillions of web pages on the Internet, and these crawlers run on sets of high-end servers to collect the most up-to-date information. That information is then presented to the user through the search engine in response to his query.

The results presented to the user may include web pages, images, or news, and everything is presented as a link. The user clicks on the desired search result and is redirected to the page he was looking for. To provide the most appropriate information, crawlers constantly update the search engine's database by revisiting websites on a regular basis.

Good SEO requires an understanding of how search engines work. A search engine rates a website according to its text, and that text lives inside the code: the coding tells the crawler exactly what the title, meta information and content are. The crawler crawls the website, picks out the most relevant information, and among all websites the most relevant ones are rated at the top. Search engines then index the pages in their database and process them, which results in a compiled database of the most relevant and appropriate websites. Indexing means tagging the words that best describe your website. That is why the website design should be search engine friendly: it must contain proper tags with relevant information inside them. Optimizing the website with proper information in its coding will let search engines give it a higher ranking.
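
To make "tagging the words" a little more concrete, here is a small sketch of how an index could map each word to the pages that contain it. It reuses the hypothetical crawl() worksheet from the sketch above and is only an illustration, not how any real search engine builds its index.

    # Minimal sketch of indexing: map each word to the URLs whose
    # title, meta description or content contain it.
    from collections import defaultdict

    def build_index(worksheets):
        index = defaultdict(set)
        for sheet in worksheets:
            text = " ".join([sheet["Title"], sheet["Meta"]] + sheet["Content"])
            for word in text.lower().split():
                index[word].add(sheet["URL"])
        return index

    # Example: index = build_index([crawl("http://example.com/")])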

When a person enters a phrase or keywords into the search box, the search engine compares these words against its indexed database to fetch the most relevant information matching the keywords entered. The results are displayed on SERPs (search engine results pages) as links with a short description. There could be millions of websites matching our keywords; it is the job of the search engine to make things easy for us by placing the most relevant websites at the top of its results. Of course this process is quite complex, but it largely comes down to how relevant a website's content, title, headings and links are to the query.
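
As a very rough sketch of that matching step, the toy index built above could be queried like this. The ranking here simply counts how many query words a page contains, which is only a stand-in for the much more complex relevance calculations real search engines perform.

    # Naive search: rank pages by how many of the query words they contain.
    def search(index, query):
        scores = {}
        for word in query.lower().split():
            for url in index.get(word, set()):
                scores[url] = scores.get(url, 0) + 1
        # Highest-scoring (most relevant) pages first, like a simple SERP.
        return sorted(scores, key=scores.get, reverse=True)

    # Example: for url in search(index, "seo basics"): print(url)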

Different search engines use different algorithms to rank search results, so we have options to optimize our website according to their strategies. Search engines also change their formulas from time to time to keep up with what users are searching for, and an SEO expert should keep pace with these changing strategies. This requires a proper SEO plan.

We cannot optimize perfectly for every search engine, so we should first analyze which search engine most users rely on. Analysis shows that Google is the most popular: more than 90% of people look for information through Google, around 4% use Yahoo and about 3% use Bing. Our focus must be on Google, while also accommodating some of Yahoo's and Bing's requirements in our optimization process.

5 comments:

  1. Check how the Google search engine works. Video:


    http://www.youtube.com/watch?v=BNHR6IQJGZs

  2. I have always read your articles, but the current article is really appreciable. Waiting for a new one.

  3. Some questions:

    How many keywords can we use on our website?
    Can we repeat the same keywords many times, including on other pages?

  4. Thanks for the appreciation.

    We can use as many keywords as we want on our website, keeping in mind that they must all be relevant to the content and subject of the website. Every page of the website may contain its own unique, relevant keywords. We can write up to 256 characters of keywords per page; if you have more keywords or phrases, you can use them on the other pages where they fit best.

    Repeating keywords will confuse the search engine about which page to rate, display and target. This is a phenomenon called keyword cannibalization: it means a competition starts between your own web pages. To some extent, keywords that are tightly bound to the content can be repeated, but it is wise to use unique keywords or phrases for the different pages of the website.

    Your comments are always welcome.

  5. To understand keyword cannibalization, see:

    http://iqaisarseo.blogspot.com/2011/03/keyword-cannibalization.html
