Urllib by entwanne · Pull Request #57 · python/python-docs-fr · GitHub

Urllib #57


Merged: 3 commits, Oct 26, 2017
Changes from 1 commit
Complete library/urllib.robotparser.po
entwanne authored and Antoine Rozo committed Oct 25, 2017
commit ce97eff9c1fce62f257254b2f76afd3584e1f178
42 changes: 35 additions & 7 deletions library/urllib.robotparser.po
@@ -3,27 +3,27 @@
 # This file is distributed under the same license as the Python package.
 # FIRST AUTHOR <EMAIL@ADDRESS>, YEAR.
 #
-#, fuzzy
 msgid ""
 msgstr ""
 "Project-Id-Version: Python 3.6\n"
 "Report-Msgid-Bugs-To: \n"
 "POT-Creation-Date: 2017-04-02 22:11+0200\n"
-"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
-"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
+"PO-Revision-Date: 2017-10-19 17:28+0100\n"
+"Last-Translator: \n"
 "Language-Team: LANGUAGE <LL@li.org>\n"
 "Language: fr\n"
 "MIME-Version: 1.0\n"
 "Content-Type: text/plain; charset=UTF-8\n"
 "Content-Transfer-Encoding: 8bit\n"
+"X-Generator: Poedit 1.6.10\n"

 #: ../Doc/library/urllib.robotparser.rst:2
 msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
-msgstr ""
+msgstr ":mod:`urllib.robotparser` --- Analyseur de fichiers *robots.txt*"

 #: ../Doc/library/urllib.robotparser.rst:10
 msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
-msgstr ""
+msgstr "**Code source :** :source:`Lib/urllib/robotparser.py`"
@@ -33,42 +33,59 @@ msgid ""
 "on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
 "orig.html."
 msgstr ""
+"Ce module fournit une simple classe, :class:`RobotFileParser`, qui permet de "
+"savoir si un *user-agent* particulier peut accéder à une URL du site web qui "
+"a publié ce fichier :file:`robots.txt`. Pour plus de détails sur la "
+"structure des fichiers :file:`robots.txt`, voir http://www.robotstxt.org/"
+"orig.html."

 #: ../Doc/library/urllib.robotparser.rst:28
 msgid ""
 "This class provides methods to read, parse and answer questions about the :"
 "file:`robots.txt` file at *url*."
 msgstr ""
+"Cette classe fournit des méthodes pour lire, analyser et répondre aux "
+"questions à propos du fichier :file:`robots.txt` disponible à l'adresse "
+"*url*."

 #: ../Doc/library/urllib.robotparser.rst:33
 msgid "Sets the URL referring to a :file:`robots.txt` file."
-msgstr ""
+msgstr "Modifie l'URL référençant le fichier :file:`robots.txt`."

 #: ../Doc/library/urllib.robotparser.rst:37
 msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
 msgstr ""
+"Lit le fichier :file:`robots.txt` depuis son URL et envoie le contenu à "
+"l'analyseur."

 #: ../Doc/library/urllib.robotparser.rst:41
 msgid "Parses the lines argument."
-msgstr ""
+msgstr "Analyse les lignes données en argument."

 #: ../Doc/library/urllib.robotparser.rst:45
 msgid ""
 "Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
 "to the rules contained in the parsed :file:`robots.txt` file."
 msgstr ""
+"Renvoie ``True`` si *useragent* est autorisé à accéder à *url* selon les "
+"règles contenues dans le fichier :file:`robots.txt` analysé."

 #: ../Doc/library/urllib.robotparser.rst:51
 msgid ""
 "Returns the time the ``robots.txt`` file was last fetched. This is useful "
 "for long-running web spiders that need to check for new ``robots.txt`` files "
 "periodically."
 msgstr ""
+"Renvoie le temps auquel le fichier ``robots.txt`` a été téléchargé pour la "
+"dernière fois. Cela est utile pour des *web spiders* de longue durée qui "
+"doivent vérifier périodiquement si le fichier est mis à jour."

 #: ../Doc/library/urllib.robotparser.rst:57
 msgid ""
 "Sets the time the ``robots.txt`` file was last fetched to the current time."
 msgstr ""
+"Indique que le fichier ``robots.txt`` a été téléchargé pour la dernière fois "
+"au temps courant."

 #: ../Doc/library/urllib.robotparser.rst:62
 msgid ""
@@ -77,6 +94,10 @@ msgid ""
 "apply to the *useragent* specified or the ``robots.txt`` entry for this "
 "parameter has invalid syntax, return ``None``."
 msgstr ""
+"Renvoie la valeur du paramètre ``Crawl-delay`` du ``robots.txt`` pour le "
+"*useragent* en question. S'il n'y a pas de tel paramètre ou qu'il ne "
+"s'applique pas au *useragent* spécifié ou si l'entrée du ``robots.txt`` pour "
+"ce paramètre a une syntaxe invalide, renvoie ``None``."

 #: ../Doc/library/urllib.robotparser.rst:71
 msgid ""
@@ -86,9 +107,16 @@ msgid ""
 "specified or the ``robots.txt`` entry for this parameter has invalid syntax, "
 "return ``None``."
 msgstr ""
+"Renvoie le contenu du paramètre ``Request-rate`` du ``robots.txt`` sous la "
+"forme d'un :func:`~collections.namedtuple` ``(requests, seconds)``. S'il "
+"n'y a pas de tel paramètre ou qu'il ne s'applique pas au *useragent* "
+"spécifié ou si l'entrée du ``robots.txt`` pour ce paramètre a une syntaxe "
+"invalide, renvoie ``None``."

 #: ../Doc/library/urllib.robotparser.rst:80
 msgid ""
 "The following example demonstrates basic use of the :class:`RobotFileParser` "
 "class::"
 msgstr ""
+"L'exemple suivant présente une utilisation basique de la classe :class:"
+"`RobotFileParser` : ::"
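The diff stops at the entry that introduces the documentation's usage example. For reference, a minimal sketch of the basic RobotFileParser usage those strings describe, covering the methods translated in this diff (can_fetch, crawl_delay, request_rate, mtime); the robots.txt URL is a placeholder, not one taken from the PR:

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder URL
    rp.read()  # fetch the robots.txt file and feed it to the parser

    # Ask whether a given user agent may fetch a given URL.
    print(rp.can_fetch("*", "https://example.com/some/page.html"))

    # Crawl-delay and Request-rate values for this user agent,
    # or None if the file does not define them.
    print(rp.crawl_delay("*"))
    print(rp.request_rate("*"))  # namedtuple (requests, seconds) or None

    # mtime() reports when robots.txt was last fetched; long-running
    # spiders can use it to decide when to call read() again.
    print(rp.mtime())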