AN ADAPTIVE MODEL FOR OPTIMIZING PERFORMANCE OF AN INCREMENTAL WEB CRAWLER

This paper outlines the design of a web crawler implemented for IBM
Almaden’s WebFountain project and describes an optimization model for controlling the
crawl strategy. This crawler is scalable and incremental. The model makes no assumptions
about the statistical behaviour of web page changes, but rather uses an adaptive approach to
maintain data on actual change rates which are in turn used as inputs for the optimization.
Computational results with simulated but realistic data show that there is no 'magic bullet':
different, but equally plausible, objectives lead to conflicting 'optimal' strategies. However,
we find that there are compromise objectives which lead to good strategies that are robust
against a number of criteria.
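The full optimization model is given in the report itself. Purely to illustrate the adaptive idea described above (maintaining observed change rates and feeding them into the recrawl decision), the following is a minimal sketch; the class names, the smoothed change-rate estimate, and the priority score are assumptions made for illustration, not the paper's method.

```python
import heapq
import time
from dataclasses import dataclass


@dataclass
class PageStats:
    """Running statistics for one URL (hypothetical structure, for illustration only)."""
    url: str
    visits: int = 0
    observed_changes: int = 0
    last_crawl: float = 0.0

    def change_rate(self) -> float:
        # Empirical fraction of visits on which the page was found changed.
        # A small smoothing prior (1 change / 2 visits) keeps new pages from
        # being estimated at exactly zero.
        return (self.observed_changes + 1) / (self.visits + 2)


class AdaptiveRecrawlScheduler:
    """Maintains per-page change-rate data adaptively and ranks pages for recrawl."""

    def __init__(self) -> None:
        self.pages: dict[str, PageStats] = {}

    def record_crawl(self, url: str, changed: bool) -> None:
        # Update the page's statistics after each visit, so the estimates
        # track actual change behaviour rather than an assumed distribution.
        stats = self.pages.setdefault(url, PageStats(url))
        stats.visits += 1
        stats.observed_changes += int(changed)
        stats.last_crawl = time.time()

    def next_to_crawl(self, k: int) -> list[str]:
        # Illustrative priority: estimated change rate times elapsed time since
        # the last visit, so pages likely to have changed and not seen recently
        # are fetched first.
        now = time.time()
        scored = [
            (s.change_rate() * (now - s.last_crawl), s.url)
            for s in self.pages.values()
        ]
        return [url for _, url in heapq.nlargest(k, scored)]
```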

By: Jenny Edwards, Kevin McCurley, John Tomlin

Published in: IBM Research Report RJ10210, 2001

LIMITED DISTRIBUTION NOTICE:

This report has been submitted for publication outside of IBM and will probably be copyrighted if accepted for publication. It has been issued as a Research Report for early dissemination of its contents. In view of the transfer of copyright to the outside publisher, its distribution outside of IBM prior to publication should be limited to peer communications and specific requests. After outside publication, requests should be filled only by reprints or legally obtained copies of the article (e.g., payment of royalties).

