"Definitions "
A robot is a program that automatically traverses the Web's hypertext structure by retrieving a document and then recursively retrieving all documents that it references.
Note that "recursive" here does not limit the definition to any specific traversal algorithm, even if a robot applies a heuristic for the selection and the order of documents to visit the site of application through a long period of time, still a robot. Normal
web browsers are not robots, because they are operated by human beings, not automatically retrieve referenced documents (other than inline images).
Web robots are sometimes referred to as Web Wanderers, Web Crawlers, or Spiders. These names are a bit misleading, because they give the impression that the software itself moves between sites like a virus; this is not the case. A robot simply visits sites by requesting documents from them.
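The retrieve-then-recurse behavior can be sketched as follows. This is a minimal, illustrative crawler, not a production one: the `fetch` callback stands in for actual HTTP retrieval, and all names are invented for the example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, limit=100):
    """Retrieve a page, then recursively retrieve every page it references.

    `fetch` is a caller-supplied function mapping a URL to its HTML text;
    `limit` caps the number of pages so the traversal always terminates.
    """
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        parser = LinkExtractor()
        parser.feed(fetch(url))
        queue.extend(parser.links)
    return seen
```

The `seen` set is what keeps the recursion from looping forever when pages link back to each other.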
What is an agent?
The word "agent" is used for a lot of meaning in computing these days. Specifically:
Autonomous agents are programs that travel between sites, deciding on their own when to move and what to do. These can only travel between special servers and are currently not widespread on the Internet.
Intelligent agents are programs that help users with things such as choosing a product, guiding them through form filling, or even helping them find things. These usually have little to do with networking. User-agent is a technical name for a program that performs networking tasks for a user, such as Web user-agents like Netscape Navigator and Microsoft Internet Explorer, or e-mail user-agents like Qualcomm Eudora.
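For example, any program acting as a user-agent (including a robot) can announce its identity with the HTTP User-Agent header. A minimal sketch using Python's standard library; the bot name and contact URL below are placeholders, not a real crawler:

```python
import urllib.request

# Hypothetical identity string; a real robot would use its own name
# and a working contact URL so server operators can reach its owner.
USER_AGENT = "ExampleBot/1.0 (+http://example.com/bot.html)"

def make_request(url):
    """Build a request that announces the program's identity."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
```

Identifying yourself this way lets a server operator see in their logs who is requesting documents and why.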
What is a search engine?
A search engine is a program that searches through some dataset. In the context of the Web, the term "search engine" is most often used for search forms that query databases of HTML documents gathered by a robot.
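As an illustrative sketch of this idea, a toy inverted index over a set of gathered documents; the documents and queries below are invented examples, and a real engine would add ranking, stemming, and much more:

```python
def build_index(documents):
    """Map each word to the set of document ids that contain it."""
    index = {}
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

def search(index, query):
    """Return the ids of documents containing every word in the query."""
    word_sets = [index.get(word, set()) for word in query.lower().split()]
    if not word_sets:
        return set()
    result = word_sets[0]
    for s in word_sets[1:]:
        result = result & s
    return result
```

The robot's job ends once the documents are gathered; the search form only ever consults the index built from them.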
What kinds of robots are there? Robots are used for several purposes: indexing, HTML validation, link validation, "What's New" monitoring, and so on.
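As an illustration of one of these purposes, link validation, a hedged sketch: the `probe` callback is a placeholder for whatever HTTP status check a real robot would perform.

```python
def check_links(links, probe):
    """Link-validation pass over a list of URLs.

    `probe` is a caller-supplied function mapping a URL to its HTTP
    status code; any status of 400 or above is reported as broken.
    """
    return [url for url in links if probe(url) >= 400]
```

A site maintainer would run such a robot periodically and repair or remove whatever it reports.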
Aren't robots bad for the Web? Some people believe they are, for a few reasons:
Certain robot implementations can (and have in the past) overloaded networks and servers. This happens especially with people who are just beginning to write a robot; these days there is enough information available about robots to prevent some of these mistakes.
Robots are operated by human beings, who make mistakes in configuration or simply do not consider the consequences of their actions. This means people need to be careful, and robot authors need to make it difficult for people to make mistakes with harmful effects.
Web-wide indexing robots build a centralized database of documents, which does not scale well to millions of documents on millions of sites.
At the same time, the majority of robots are well designed, professionally operated, cause no problems, and provide a valuable service in the absence of widely deployed better solutions. So no, robots are not inherently bad, nor inherently brilliant; they simply need careful attention.
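To illustrate the kind of careful attention this calls for, a sketch of two common precautions a robot author might take: honoring a site's robots.txt exclusion rules and rate-limiting requests to any one server. The robots.txt body and delay value below are invented for the example.

```python
import time
from urllib import robotparser

def allowed(robots_txt_lines, agent, url):
    """Check a URL against an already-retrieved robots.txt body."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt_lines)
    return rp.can_fetch(agent, url)

class PoliteFetcher:
    """Enforces a minimum delay between successive requests to one server."""
    def __init__(self, delay=1.0):
        self.delay = delay
        self.last = 0.0

    def wait(self):
        """Sleep just long enough to keep requests at least `delay` apart."""
        remaining = self.delay - (time.monotonic() - self.last)
        if remaining > 0:
            time.sleep(remaining)
        self.last = time.monotonic()
```

Checking the exclusion rules keeps the robot out of areas the operator wants left alone, and the delay keeps it from hammering any single server.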