What is a search engine spider, and what is its role?


A search engine robot (also called a crawler or spider) is an automated program that search engines use to deliver up-to-date web search results to users. The robot visits each web page, extracts the available hyperlinks and associated meta information, and adds the discovered URLs to a list of URLs to be crawled. URLs from this list are then visited or revisited according to the crawl policy.
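The core loop described above (extract hyperlinks from a page, add unseen URLs to a list to be crawled) can be sketched in Python using only the standard library. This is a minimal illustration, not a production crawler; the function and class names are invented for this example, and real crawlers would also honor robots.txt, rate limits, and a revisit policy.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, resolved against the page URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, page_url):
    """Return all hyperlink targets found in an HTML document."""
    parser = LinkExtractor(page_url)
    parser.feed(html)
    return parser.links


def enqueue_new(frontier, seen, links):
    """Add previously unseen URLs to the crawl frontier.

    The 'seen' set implements the simplest possible crawl policy:
    visit each URL at most once.
    """
    for url in links:
        if url not in seen:
            seen.add(url)
            frontier.append(url)
```

For example, feeding `extract_links` a page containing `<a href="/about">` with the page URL `https://example.com/index.html` yields the absolute URL `https://example.com/about`, which `enqueue_new` would then place on the frontier if it has not been seen before.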
