Abstract:
• A web crawler is a program that automatically traverses the web by downloading documents and following the links it finds from page to page (a minimal sketch is given after this list).
• Web crawlers are also known as spiders, robots, or wanderers.
• The crawling behavior of a search engine's web crawler is the combination of two policies (see the second sketch after this list):
  • Selection policy
  • Politeness policy
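The following is a minimal sketch, in Java, of the traversal loop described above: download a page, extract its links, and add them to a frontier queue. The seed URL, the page limit, and the regex-based link extraction are illustrative assumptions, not part of the original project.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayDeque;
import java.util.HashSet;
import java.util.Queue;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SimpleCrawler {
    // Hypothetical limit on how many pages this sketch visits.
    private static final int MAX_PAGES = 50;
    // Naive link extraction for illustration; a real crawler would use an HTML parser.
    private static final Pattern LINK = Pattern.compile("href=\"(http[^\"]+)\"");

    public static void main(String[] args) {
        Queue<String> frontier = new ArrayDeque<>();
        Set<String> visited = new HashSet<>();
        frontier.add("https://example.com/");           // placeholder seed URL

        while (!frontier.isEmpty() && visited.size() < MAX_PAGES) {
            String url = frontier.poll();
            if (!visited.add(url)) continue;            // skip pages already downloaded

            String html = download(url);                // download the document
            System.out.println("Fetched " + url + " (" + html.length() + " chars)");

            Matcher m = LINK.matcher(html);             // follow links found in the page
            while (m.find()) {
                frontier.add(m.group(1));
            }
        }
    }

    // Downloads a page body as a string; returns "" on any error.
    private static String download(String url) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestProperty("User-Agent", "SimpleCrawler/0.1");
            StringBuilder sb = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) sb.append(line).append('\n');
            }
            return sb.toString();
        } catch (Exception e) {
            return "";
        }
    }
}
```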
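The second sketch illustrates how the two policies could be expressed in code, under assumptions not taken from the original post: the selection policy restricts crawling to HTTP(S) pages on a single allowed host and skips obvious non-HTML resources, while the politeness policy enforces a hypothetical minimum delay (here 2 seconds) between requests to the same host.

```java
import java.net.URL;
import java.util.HashMap;
import java.util.Map;

public class CrawlPolicies {
    // Assumed politeness interval: at least 2 seconds between requests to one host.
    private static final long DELAY_MS = 2000;
    private final Map<String, Long> lastHit = new HashMap<>();

    // Selection policy sketch: crawl only HTTP(S) URLs on the allowed host,
    // and skip resources that are clearly not HTML pages.
    boolean shouldCrawl(String url, String allowedHost) {
        try {
            URL u = new URL(url);
            boolean http = u.getProtocol().startsWith("http");
            boolean sameHost = u.getHost().equalsIgnoreCase(allowedHost);
            boolean looksLikeHtml = !url.matches(".*\\.(jpg|png|gif|css|js|pdf)$");
            return http && sameHost && looksLikeHtml;
        } catch (Exception e) {
            return false;                // malformed URLs are never selected
        }
    }

    // Politeness policy sketch: block until enough time has passed since the
    // previous request to this host, then record the new request time.
    void waitForHost(String host) throws InterruptedException {
        long now = System.currentTimeMillis();
        Long last = lastHit.get(host);
        if (last != null && now - last < DELAY_MS) {
            Thread.sleep(DELAY_MS - (now - last));
        }
        lastHit.put(host, System.currentTimeMillis());
    }
}
```

In practice a politeness policy also honors each site's robots.txt rules; that check is omitted here to keep the sketch short.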
Ø Software Requirements:
| Presentation Layer | HTML, DHTML, JavaScript, XML |
| Network Layer | TCP/IP |
| Web Server Layer | Tomcat 5.5, Servlets, JDBC, JSP |
| Language Specification | Java |
| Web Technology | HTML |
| Databases | Oracle 8i |
| Operating Systems | Windows 2000, Windows NT 4.0, Windows 9x |

Hardware Requirements:
*Or the minimum required by the operating system, whichever is higher