Explain the different types of search engines, with examples.

 

ANS: The purpose of a search engine is to extract the requested information from the huge pool of resources available on the internet. Search engines have become an important day-to-day tool for finding required information without knowing exactly where it is stored. There are different types of search engines to get the information you are looking for.

Search engines are classified into the following categories based on how they work.

  1. Crawler based search engines
  2. Human-powered directories
  3. Hybrid search engines
  4. Other special search engines

1. Crawler Based Search Engines

All crawler-based search engines use a crawler (also called a bot or spider) for crawling and indexing new content into the search database. There are four basic steps every crawler-based search engine follows before displaying any site in the search results.

  1. Crawling
  2. Indexing
  3. Calculating Relevancy
  4. Retrieving the Result

1.1. Crawling

Search engines crawl the whole web to fetch the web pages available. A piece of software called a crawler, bot, or spider performs this crawling. The crawling frequency depends on the search engine, and it may take a few days between crawls. This is why you can sometimes see old or deleted page content showing in the search results; the results will show the newly updated content once the search engine crawls your site again.
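As a rough sketch of the idea (not any particular search engine's crawler), the Python snippet below fetches a page, extracts its links with the standard library, and queues them for the next visit. The seed URL and page limit are just placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag found while parsing a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Very small breadth-first crawl: fetch a page, queue its links, repeat."""
    to_visit = [seed_url]
    visited = set()
    pages = {}  # url -> raw HTML, handed to the indexer in the next step

    while to_visit and len(visited) < max_pages:
        url = to_visit.pop(0)
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except (OSError, ValueError):
            continue  # skip pages that cannot be fetched
        visited.add(url)
        pages[url] = html

        parser = LinkExtractor()
        parser.feed(html)
        # Resolve relative links against the current page before queueing them.
        to_visit.extend(urljoin(url, link) for link in parser.links)

    return pages


if __name__ == "__main__":
    fetched = crawl("https://example.com")  # placeholder seed URL
    print(f"Fetched {len(fetched)} page(s)")
```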

1.2. Indexing

Indexing is the step after crawling: the process of identifying the words and expressions that best describe the page. The identified words are referred to as keywords, and the page is assigned to them. When the crawler does not understand the meaning of your page, your site may rank lower in the search results, so you need to optimize your pages for search engine crawlers to make sure the content is easily understandable. Once the crawlers pick up the correct keywords, your page will be assigned to those keywords and can rank higher in search results.
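A minimal sketch of how indexing can work, assuming the crawler has already handed over the page text: each page is split into words, and an inverted index maps every keyword to the pages assigned to it. The sample pages below are made up for illustration.

```python
import re
from collections import defaultdict

# Toy corpus standing in for crawled pages; real engines index billions of documents.
pages = {
    "https://example.com/python": "Python is a popular programming language",
    "https://example.com/java": "Java is a programming language for enterprise apps",
}

def tokenize(text):
    """Lower-case the text and split it into simple word tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

# Inverted index: keyword -> set of page URLs assigned to that keyword.
index = defaultdict(set)
for url, text in pages.items():
    for word in tokenize(text):
        index[word].add(url)

print(index["programming"])  # both pages are assigned to this keyword
```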

1.3. Calculating Relevancy

The search engine compares the search string in the search request with the indexed pages from the database. Since more than one page likely contains the search string, the search engine starts calculating the relevancy of each of the pages in its index with the search string.

There are various algorithms for calculating relevancy. Each of them assigns different relative weights to common factors like keyword density, links, or meta tags, which is why different search engines return different results pages for the same search string. All major search engines periodically change their algorithms, and if you want to keep your site at the top, you need to adapt your pages to the latest changes. This is one reason to devote ongoing effort to SEO if you want to stay at the top.
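The real formulas are proprietary, but a toy version of the idea might look like the snippet below: each indexed page is scored against the query using a couple of weighted factors (keyword density and a stand-in link count). The weights and sample data are arbitrary assumptions, not any engine's actual algorithm.

```python
def keyword_density(words, query_terms):
    """Fraction of the page's words that match a query term."""
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in query_terms)
    return hits / len(words)


def relevancy(page_words, inbound_links, query_terms,
              density_weight=0.7, link_weight=0.3):
    """Combine factors with weights; real engines use many more signals."""
    # Normalise the link count so it stays comparable to the density score.
    link_score = min(inbound_links / 100.0, 1.0)
    return density_weight * keyword_density(page_words, query_terms) + link_weight * link_score


query = {"programming", "language"}
pages = {
    "https://example.com/python": (["python", "is", "a", "popular", "programming", "language"], 80),
    "https://example.com/java": (["java", "programming", "language", "guide"], 20),
}

scores = {url: relevancy(words, links, query) for url, (words, links) in pages.items()}
print(scores)
```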

1.4. Retrieving Results

The last step in a search engine's activity is retrieving the results, which simply means displaying them in the browser in order. Search engines sort the endless pages of search results from the most relevant to the least relevant sites.
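Continuing the toy example from the previous step, retrieval then amounts to sorting the scored pages from most to least relevant before rendering them (the scores below are made up):

```python
# Assume `scores` maps each URL to the relevancy computed in the previous step.
scores = {
    "https://example.com/python": 0.47,
    "https://example.com/java": 0.41,
    "https://example.com/cooking": 0.02,
}

# Sort URLs by score, highest first, exactly as a results page is ordered.
results = sorted(scores.items(), key=lambda item: item[1], reverse=True)

for rank, (url, score) in enumerate(results, start=1):
    print(f"{rank}. {url} (score {score:.2f})")
```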

Examples of Crawler Based Search Engines

Most of the popular search engines are crawler-based and use the above technology to display search results. Examples of crawler-based search engines include Google, Bing, and Yahoo!.

Besides these popular search engines, there are many other crawler-based search engines available like DuckDuckGo, AOL, and Ask.

2. Human Powered Directories

Human-powered directories, also referred to as open directories, depend on human activity for their listings. Below is how indexing in human-powered directories works:

  • The site owner submits a short description of the site to the directory, along with the category under which it is to be listed.
  • The submitted site is then manually reviewed and either added to the appropriate category or rejected for listing.
  • Keywords entered in the search box are matched against the submitted descriptions of the sites. This means that changes made to the content of a web page are not taken into consideration, as only the description matters (a small sketch of this matching follows this list).
  • A site with good content is more likely to be reviewed for free than a site with poor content.
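A small sketch of that description-based matching, with made-up directory entries: the query is compared only against the submitted descriptions, never against the pages themselves.

```python
# Hypothetical directory entries: URL -> (category, human-written description).
directory = {
    "https://example.com/recipes": ("Food", "Home cooking recipes and baking tips"),
    "https://example.com/budget": ("Finance", "Simple budgeting advice for families"),
}

def directory_search(query):
    """Return entries whose submitted description contains every query word."""
    terms = query.lower().split()
    return [
        url
        for url, (_category, description) in directory.items()
        if all(term in description.lower() for term in terms)
    ]

print(directory_search("baking tips"))  # matches on the description only
```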

Yahoo! Directory and DMOZ were perfect examples of human-powered directories. Unfortunately, automated search engines like Google wiped those human-powered, directory-style search engines off the web.

3. Hybrid Search Engines

Hybrid search engines use both crawler-based and manual indexing for listing sites in the search results. Most crawler-based search engines, like Google, use crawling as the primary mechanism and human-powered directories as a secondary mechanism. For example, Google may take the description of a webpage from a human-powered directory and show it in the search results. As human-powered directories are disappearing, hybrid search engines are becoming more and more crawler-based.

Even so, manual filtering of search results still happens to remove copied and spammy sites. When a site is identified for spammy activity, the website owner needs to take corrective action and resubmit the site to the search engine. Experts then manually review the resubmitted site before including it in the search results again. In this way, crawlers drive the process, but manual review remains in place to keep the search results clean and natural.
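As a rough sketch of how the two mechanisms can be combined (the data and field names are assumptions for illustration): crawler results come first, an editor-written description overrides the automatically extracted snippet when one exists, and manually flagged sites are filtered out before display.

```python
# Crawler output: URL -> snippet extracted automatically from the page.
crawled = {
    "https://example.com/python": "Python is a popular programming language...",
    "https://spammy.example.net": "buy cheap followers now!!!",
}

# Human-curated data: editor-written descriptions and a manual spam list.
curated_descriptions = {
    "https://example.com/python": "A beginner-friendly introduction to Python.",
}
flagged_as_spam = {"https://spammy.example.net"}

def build_results(crawled, curated_descriptions, flagged_as_spam):
    """Crawler results first; manual data overrides snippets and removes spam."""
    results = []
    for url, snippet in crawled.items():
        if url in flagged_as_spam:
            continue  # manual review keeps flagged sites out until they are resubmitted
        results.append((url, curated_descriptions.get(url, snippet)))
    return results

print(build_results(crawled, curated_descriptions, flagged_as_spam))
```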

4. Other Types of Search Engines

Besides the major types above, search engines can be classified into many other categories depending on usage. Below are some examples:

  • Search engines have different types of bots exclusively for images, videos, news, products, and local listings. For example, the Google News page can be used to search only for news from different newspapers.
  • Some search engines, like Dogpile, collect meta information about pages from other search engines and directories to display in their search results. These are called metasearch engines (see the sketch after this list).
  • Semantic search engines like Swoogle provide accurate search results in a specific area by understanding the contextual meaning of the search queries.
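A metasearch engine does not crawl the web itself. As a rough sketch (the per-engine result lists below are made up), it forwards the query to other engines and merges their ranked lists, for example by summing reciprocal ranks:

```python
from collections import defaultdict

# Hypothetical ranked result lists returned by two underlying search engines.
engine_results = {
    "engine_a": ["https://example.com/python", "https://example.com/java"],
    "engine_b": ["https://example.com/java", "https://example.com/go"],
}

def merge(engine_results):
    """Merge ranked lists by giving each URL a score of 1/rank per engine."""
    scores = defaultdict(float)
    for results in engine_results.values():
        for rank, url in enumerate(results, start=1):
            scores[url] += 1.0 / rank
    # Highest combined score first, mimicking the metasearch results page.
    return sorted(scores, key=scores.get, reverse=True)

print(merge(engine_results))
```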
