WWW::RobotRules(3)
Section : 3 (User Contributed Perl Documentation)
Updated : 2022-01-21
Source : perl v5.34.0

NAME
    WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
      use WWW::RobotRules;
      my $rules = WWW::RobotRules->new('MOMspider/1.0');

      use LWP::Simple qw(get);

      {
        my $url = "http://some.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
      }

      {
        my $url = "http://some.other.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
      }

      # Now we can check if a URL is valid for those servers
      # whose "robots.txt" files we've gotten and parsed:
      if ($rules->allowed($url)) {
          $c = get $url;
          ...
      }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for
    Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
    Webmasters can use the /robots.txt file to forbid conforming robots
    from accessing parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited. The
    same WWW::RobotRules object can be used for one or more parsed
    /robots.txt files on any number of hosts.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first
        argument given to new() is the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to
        retrieve the /robots.txt file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear
        the robots.txt rules and expire times out of the cache.
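    For example, the following minimal sketch (the host name and
    robots.txt content are hypothetical) exercises new(), parse() and
    allowed(), and shows the cache-clearing behaviour of agent() noted
    above:

      use WWW::RobotRules;

      my $rules = WWW::RobotRules->new('MOMspider/1.0');

      # Parse a hypothetical robots.txt that keeps everyone out of /private/.
      $rules->parse('http://example.com/robots.txt', <<'EOT');
      User-agent: *
      Disallow: /private/
      EOT

      print $rules->allowed('http://example.com/private/data'), "\n"; # 0 (disallowed)
      print $rules->allowed('http://example.com/index.html'), "\n";   # 1 (allowed)

      # Changing the agent name clears the cached rules and expire times,
      # so the robots.txt above would have to be parsed again.
      $rules->agent('OtherSpider/2.0');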

ROBOTS.TXT
    The format and semantics of the /robots.txt file are as follows (this
    is an edited abstract of <http://www.robotstxt.org/wc/norobots.html>):

    The file consists of one or more records separated by one or more
    blank lines. Each record contains lines of the form

      <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a
    line is ignored during parsing. This is used for comments. The
    following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is
        describing access policy for. If more than one User-Agent field is
        present the record describes an identical access policy for more
        than one robot. At least one field needs to be present per record.
        If the value is '*', the record describes the default access
        policy for any robot that has not matched any of the other
        records.

        The User-Agent fields must occur before the Disallow fields. If a
        record contains a User-Agent field after a Disallow field, that
        constitutes a malformed record. This parser will assume that a
        blank line should have been placed before that User-Agent field,
        and will break the record into two. All the fields before the
        User-Agent field will constitute a record, and the User-Agent
        field will be the first field in a new record.

    Disallow
        The value of this field specifies a partial URL that is not to be
        visited. This can be a full path, or a partial path; any URL that
        starts with this value will not be retrieved.

    Unrecognized records are ignored.
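    To illustrate the record-splitting behaviour described above for
    malformed records, here is a small sketch (the robot name, host and
    paths are hypothetical):

      use WWW::RobotRules;

      my $rules = WWW::RobotRules->new('cybermapper/1.0');

      # No blank line before the second User-agent field, so the parser
      # treats everything from that field on as the start of a new record.
      $rules->parse('http://example.org/robots.txt', <<'EOT');
      User-agent: *
      Disallow: /tmp/
      User-agent: cybermapper
      Disallow: /secret/
      EOT

      # cybermapper is governed by the second (split-off) record only:
      print $rules->allowed('http://example.org/tmp/file'), "\n";    # 1 (allowed)
      print $rules->allowed('http://example.org/secret/file'), "\n"; # 0 (disallowed)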

ROBOTS.TXT EXAMPLES
    The following example /robots.txt file specifies that no robots should
    visit any URL starting with /cyberworld/map/ or /tmp/:

      User-agent: *
      Disallow: /cyberworld/map/ # This is an infinite virtual URL space
      Disallow: /tmp/ # these will soon disappear

    This example /robots.txt file specifies that no robots should visit
    any URL starting with /cyberworld/map/, except the robot called
    cybermapper:

      User-agent: *
      Disallow: /cyberworld/map/ # This is an infinite virtual URL space

      # Cybermapper knows where to go.
      User-agent: cybermapper
      Disallow:

    This example indicates that no robots should visit this site further:

      # go away
      User-agent: *
      Disallow: /

    This is an example of a malformed robots.txt file.

      # robots.txt for ancientcastle.example.com
      # I've locked myself away.
      User-agent: *
      Disallow: /
      # The castle is your home now, so you can go anywhere you like.
      User-agent: Belle
      Disallow: /west-wing/ # except the west wing!
      # It's good to be the Prince...
      User-agent: Beast
      Disallow:

    This file is missing the required blank lines between records.
    However, the intention is clear.
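    As a quick check, the first example above can be fed to
    WWW::RobotRules directly (the robot name and host are hypothetical):

      use WWW::RobotRules;

      my $rules = WWW::RobotRules->new('AnyBot/1.0');   # hypothetical robot name
      $rules->parse('http://example.net/robots.txt', <<'EOT');
      User-agent: *
      Disallow: /cyberworld/map/ # This is an infinite virtual URL space
      Disallow: /tmp/ # these will soon disappear
      EOT

      # Trailing '#' comments are stripped during parsing, so the rules
      # match on the bare paths:
      print $rules->allowed('http://example.net/cyberworld/map/nw.html'), "\n"; # 0
      print $rules->allowed('http://example.net/tmp/scratch'), "\n";            # 0
      print $rules->allowed('http://example.net/welcome.html'), "\n";           # 1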

SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File

COPYRIGHT
      Copyright 1995-2009, Gisle Aas
      Copyright 1995, Martijn Koster

    This library is free software; you can redistribute it and/or modify
    it under the same terms as Perl itself.