Friday, February 18, 2011

Indexing in SEO training

Indexing is the process of compiling the information fetched from the internet. It works much like the index of a book.

The crawler processes each page it crawls in order to compile a massive index of all the words it sees and their location on each page. The crawler can process many, but not all, content types.
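As a rough illustration, a word-to-location index could be sketched in Python as follows. The page URLs and text below are made-up examples; real search engines store far more detail per entry.

from collections import defaultdict

def build_index(pages):
    """Map each word to the pages and positions where it appears."""
    index = defaultdict(list)
    for url, text in pages.items():
        for position, word in enumerate(text.lower().split()):
            index[word].append((url, position))
    return index

pages = {
    "example.com/a": "SEO training covers crawling and indexing",
    "example.com/b": "Indexing works like the index of a book",
}

index = build_index(pages)
print(index["indexing"])  # [('example.com/a', 5), ('example.com/b', 0)]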

The crawling process takes into account how often a page should be crawled, the nature of the page, relevant and irrelevant links, meta descriptions, content, geographical location and, of course, ALT tags, titles, headings and so on. This information is used to display search results, which are determined from the data fetched and indexed by the search engine. The search engine uses a crawler to do this job.
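To give an idea of how some of these signals are pulled out of a page, here is a small sketch using Python's standard HTML parser. It extracts only the title, meta description and image ALT text; the sample HTML is invented for the example.

from html.parser import HTMLParser

class SignalExtractor(HTMLParser):
    """Collect a few on-page signals: title, meta description, ALT text."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""
        self.alt_texts = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and "alt" in attrs:
            self.alt_texts.append(attrs["alt"])

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = """
<html><head>
  <title>SEO Training</title>
  <meta name="description" content="Learn how search engines index pages.">
</head><body>
  <img src="logo.png" alt="SEO training logo">
</body></html>
"""

extractor = SignalExtractor()
extractor.feed(html)
print(extractor.title, extractor.meta_description, extractor.alt_texts)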

A crawler is algorithm-driven software designed to fetch information about pages across the internet so that the most relevant search results can be produced.
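A minimal, hypothetical crawler might look like the sketch below: fetch a page, collect its links, and visit them in turn up to a small limit. Real crawlers also respect robots.txt, throttle their requests and handle many content types, none of which is shown here.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collect the href of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start_url, max_pages=5):
    """Breadth-first crawl starting from start_url, visiting at most max_pages."""
    seen, queue = set(), deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except OSError:
            continue  # skip pages that fail to load
        parser = LinkParser()
        parser.feed(html)
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen

# crawl("https://example.com")  # returns the set of URLs visited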


Also See:

How to build Search Engine Friendly web page
Understanding HTML for SEO
