
distributed search

Distributed search is a search engine model in which the tasks of Web crawling, indexing and query processing are distributed among multiple computers and networks.

Originally, most search engines were supported by a single supercomputer. In recent years, however, most have moved to a distributed model. Google search, for example, relies upon thousands of computers crawling the Web from multiple locations all over the world.

In Google's distributed search system, each computer involved in indexing crawls and reviews a portion of the Web, taking a URL and following every link available from it (minus those marked for exclusion). The computer gathers the crawled results from the URLs and sends that information back to a centralized server in compressed format. The centralized server then coordinates that information in a database, along with information from other computers involved in indexing.
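The per-node work described above — take a page, extract its outgoing links (skipping exclusions), and ship the content back compressed — can be sketched in a few lines of Python. This is an illustrative toy, not Google's actual crawler; the function and field names are invented for the example.

```python
import zlib
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags, skipping excluded URLs."""

    def __init__(self, excluded):
        super().__init__()
        self.links = []
        self.excluded = excluded

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value not in self.excluded:
                    self.links.append(value)


def crawl_page(url, html, excluded=frozenset()):
    """Simulate one indexing node's work on a single page: extract the
    outgoing links (minus those marked for exclusion) and compress the
    page text for shipment to the centralized server."""
    parser = LinkExtractor(excluded)
    parser.feed(html)
    payload = zlib.compress(html.encode("utf-8"))
    return {"url": url, "links": parser.links, "compressed": payload}


# One node processes one page; the discovered links would be queued
# for other nodes, and the compressed payload sent to the central server.
page = '<a href="/about">About</a> <a href="/private">Private</a>'
result = crawl_page("http://example.com", page, excluded={"/private"})
```

In a real deployment each node would fetch pages over the network and feed newly discovered links back into a shared frontier; here the page is passed in directly so the sketch stays self-contained.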

When a user types a query into the search field, Google's domain name server (DNS) software relays the query to the most logical cluster of computers, based on factors such as the cluster's proximity to the user and its current load. At the recipient cluster, the Web server software distributes the query to hundreds or thousands of computers to search simultaneously. Hundreds of computers scan the database index to find all relevant records. The index server compiles the results, the document server pulls together the titles and summaries and the page builder creates the search result pages.
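The scatter-gather step at the heart of this process — fan the query out to every index shard at once, then merge and rank the partial results — can be sketched as follows. The shard contents and scores are made up for illustration; this is a minimal model of the pattern, not Google's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical index split into shards: each shard maps a term to its
# own postings, here (document id, relevance score) pairs.
SHARDS = [
    {"search": [("doc1", 0.9), ("doc3", 0.4)]},
    {"search": [("doc2", 0.7)], "engine": [("doc2", 0.5)]},
    {"engine": [("doc4", 0.8)]},
]


def search_shard(shard, term):
    """Each index server scans only its own slice of the index."""
    return shard.get(term, [])


def distributed_query(term, shards=SHARDS):
    """Scatter the query to every shard in parallel, then gather the
    partial hits and rank them into one result list."""
    with ThreadPoolExecutor(max_workers=len(shards)) as pool:
        partials = pool.map(lambda shard: search_shard(shard, term), shards)
    hits = [hit for partial in partials for hit in partial]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)


results = distributed_query("search")
# Hits from all three shards, merged and ordered by score.
```

Because each shard holds only a fraction of the index, every machine scans a small amount of data, and the query latency is governed by the slowest shard rather than the total index size.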

Some projects, such as Wikia Search (formerly Grub), are moving towards an even more decentralized search model. Like distributed computing projects such as SETI@home, many distributed search projects are supported by a network of voluntary users whose computers run client software in the background.

This was last updated in April 2008