File: /usr/lib64/python3.9/urllib/__pycache__/robotparser.cpython-39.pyc
""" robotparser.py

    Copyright (C) 2000  Bastian Kleineidam

    You can choose between two licenses when using this package:
    1) GNU GPLv2
    2) PSF license for Python 2.2

    The robots.txt Exclusion Protocol is implemented as specified in
    http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """ This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.

    """

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.

        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.

        """
        import time
        self.last_checked = time.time()
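The legible portion of the dump ends after modified(); the remaining methods of the class (set_url, read, parse, can_fetch, and the crawl-delay helpers) are not recoverable above. As a minimal usage sketch of RobotFileParser as exposed by the standard library (the site and user-agent names below are placeholders, not taken from the dump):

from urllib.robotparser import RobotFileParser

# Minimal sketch; example.com and "MyCrawler" are hypothetical.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the remote robots.txt

# can_fetch() answers whether the named user agent may retrieve a URL.
print(rp.can_fetch("MyCrawler", "https://example.com/some/page.html"))

# mtime() reports when the file was last fetched (see docstring above);
# a long-running crawler can compare it against a refresh interval and
# call read() again when its copy is considered stale.
print(rp.mtime())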