diff --git a/.github/ISSUE_TEMPLATE/Bug_report.md b/.github/ISSUE_TEMPLATE/Bug_report.md
new file mode 100644
index 0000000000..5133965fbe
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/Bug_report.md
@@ -0,0 +1,34 @@
+---
+name: Bug report
+about: Create a report to help us improve
+
+---
+
+Summary.
+
+## Expected Result
+
+What you expected.
+
+## Actual Result
+
+What happened instead.
+
+## Reproduction Steps
+
+```python
+import requests
+
+```
+
+## System Information
+
+ $ python -m requests.help
+
+```
+
+```
+
+This command is only available on Requests v2.16.4 and greater. Otherwise,
+please provide some basic information about your system (Python version,
+operating system, &c).
diff --git a/.github/ISSUE_TEMPLATE/Custom.md b/.github/ISSUE_TEMPLATE/Custom.md
new file mode 100644
index 0000000000..19291c1500
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/Custom.md
@@ -0,0 +1,7 @@
+---
+name: Request for Help
+about: Guidance on using Requests.
+
+---
+
+Please refer to our [Stack Overflow tag](https://stackoverflow.com/questions/tagged/python-requests) for guidance.
diff --git a/.github/ISSUE_TEMPLATE/Feature_request.md b/.github/ISSUE_TEMPLATE/Feature_request.md
new file mode 100644
index 0000000000..dcf6a445fb
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/Feature_request.md
@@ -0,0 +1,7 @@
+---
+name: Feature request
+about: Suggest an idea for this project
+
+---
+
+Requests is not accepting feature requests at this time.
diff --git a/.gitignore b/.gitignore
index 19ebfd7976..cd0c32e95c 100644
--- a/.gitignore
+++ b/.gitignore
@@ -21,3 +21,5 @@ t.py
t2.py
dist
+
+/.mypy_cache/
diff --git a/.travis.yml b/.travis.yml
index 395f573120..efb75ddec2 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -1,14 +1,5 @@
sudo: false
language: python
-python:
- - "2.6"
- - "2.7"
- - "3.4"
- - "3.5"
- - "3.6"
- # - "3.7-dev"
- # - "pypy" -- appears to hang
- # - "pypy3"
# command to install dependencies
install: "make"
# command to run tests
@@ -21,11 +12,31 @@ jobs:
include:
- stage: test
script:
- - |
- if [[ "$TRAVIS_PYTHON_VERSION" != "2.6" ]] ; then make test-readme; fi
+ - make test-readme
- make ci
+ python: '2.7'
+ - stage: test
+ script:
+ - make test-readme
+ - make ci
+ python: '3.4'
+ - stage: test
+ script:
+ - make test-readme
+ - make ci
+ python: '3.5'
+ - stage: test
+ script:
+ - make test-readme
+ - make ci
+ python: '3.6'
+ - stage: test
+ script:
+ - make test-readme
+ - make ci
+ python: '3.7'
+ dist: xenial
+ sudo: true
- stage: coverage
python: 3.6
script: codecov
-
-
diff --git a/AUTHORS.rst b/AUTHORS.rst
index b87dc4439d..f0ee696b25 100644
--- a/AUTHORS.rst
+++ b/AUTHORS.rst
@@ -1,13 +1,17 @@
Requests is written and maintained by Kenneth Reitz and
various contributors:
-Keepers of the Four Crystals
-````````````````````````````
+Keepers of the Crystals
+```````````````````````
+- Kenneth Reitz `@kennethreitz <https://github.com/kennethreitz>`_, Keeper of the Master Crystal.
+- Ian Cordasco `@sigmavirus24 <https://github.com/sigmavirus24>`_.
+- Nate Prewitt `@nateprewitt <https://github.com/nateprewitt>`_.
+
+Previous Keepers of Crystals
+````````````````````````````
+
- Cory Benfield `@lukasa <https://github.com/lukasa>`_
-- Ian Cordasco `@sigmavirus24 `_
-- Nate Prewitt `@nateprewitt `_
Patches and Suggestions
@@ -121,7 +125,7 @@ Patches and Suggestions
- Bryce Boe (`@bboe `_)
- Colin Dunklau (`@cdunklau `_)
- Bob Carroll (`@rcarz `_)
-- Hugo Osvaldo Barrera (`@hobarrera `_)
+- Hugo Osvaldo Barrera (`@hobarrera `_)
- Łukasz Langa
- Dave Shawley
- James Clarke (`@jam `_)
@@ -178,3 +182,10 @@ Patches and Suggestions
- Ryan Pineo (`@ryanpineo `_)
- Ed Morley (`@edmorley `_)
- Matt Liu (`@mlcrazy `_)
+- Taylor Hoff (`@PrimordialHelios <https://github.com/PrimordialHelios>`_)
+- Arthur Vigil (`@ahvigil <https://github.com/ahvigil>`_)
+- Nehal J Wani (`@nehaljwani <https://github.com/nehaljwani>`_)
+- Demetrios Bairaktaris (`@DemetriosBairaktaris <https://github.com/DemetriosBairaktaris>`_)
+- Darren Dormer (`@ddormer <https://github.com/ddormer>`_)
+- Rajiv Mayani (`@mayani <https://github.com/mayani>`_)
+- Antti Kaihola (`@akaihola <https://github.com/akaihola>`_)
diff --git a/HISTORY.md b/HISTORY.md
new file mode 100644
index 0000000000..09a09eee07
--- /dev/null
+++ b/HISTORY.md
@@ -0,0 +1,1623 @@
+Release History
+===============
+
+dev
+---
+
+**Bugfixes**
+
+- \[Short description of non-trivial change.\]
+
+2.20.0 (2018-10-18)
+-------------------
+
+**Bugfixes**
+
+- Content-Type header parsing is now case-insensitive (e.g.
+  charset=utf8 vs Charset=utf8).
+- Fixed exception leak where certain redirect urls would raise
+ uncaught urllib3 exceptions.
+- Requests removes Authorization header from requests redirected
+ from https to http on the same hostname. (CVE-2018-18074)
+- `should_bypass_proxies` now handles URIs without hostnames (e.g.
+ files).
+
+**Dependencies**
+
+- Requests now supports urllib3 v1.24.
+
+**Deprecations**
+
+- Requests has officially stopped support for Python 2.6.
+
+2.19.1 (2018-06-14)
+-------------------
+
+**Bugfixes**
+
+- Fixed issue where status\_codes.py's `init` function failed trying
+ to append to a `__doc__` value of `None`.
+
+2.19.0 (2018-06-12)
+-------------------
+
+**Improvements**
+
+- Warn user about possible slowdown when using cryptography version
+ < 1.3.4
+- Check for invalid host in proxy URL, before forwarding request to
+ adapter.
+- Fragments are now properly maintained across redirects. (RFC7231
+ 7.1.2)
+- Removed use of cgi module to expedite library load time.
+- Added support for SHA-256 and SHA-512 digest auth algorithms.
+- Minor performance improvement to `Request.content`.
+- Migrate to using collections.abc for 3.7 compatibility.
+
+**Bugfixes**
+
+- Parsing empty `Link` headers with `parse_header_links()` no longer
+  returns one bogus entry.
+- Fixed issue where loading the default certificate bundle from a zip
+ archive would raise an `IOError`.
+- Fixed issue with unexpected `ImportError` on Windows systems which do
+  not support the `winreg` module.
+- DNS resolution in proxy bypass no longer includes the username and
+ password in the request. This also fixes the issue of DNS queries
+ failing on macOS.
+- Properly normalize adapter prefixes for url comparison.
+- Passing `None` as a file pointer to the `files` param no longer
+ raises an exception.
+- Calling `copy` on a `RequestsCookieJar` will now preserve the cookie
+ policy correctly.
+
+**Dependencies**
+
+- We now support idna v2.7.
+- We now support urllib3 v1.23.
+
+2.18.4 (2017-08-15)
+-------------------
+
+**Improvements**
+
+- Error messages for invalid headers now include the header name for
+ easier debugging
+
+**Dependencies**
+
+- We now support idna v2.6.
+
+2.18.3 (2017-08-02)
+-------------------
+
+**Improvements**
+
+- Running `$ python -m requests.help` now includes the installed
+ version of idna.
+
+**Bugfixes**
+
+- Fixed issue where Requests would raise `ConnectionError` instead of
+ `SSLError` when encountering SSL problems when using urllib3 v1.22.
+
+2.18.2 (2017-07-25)
+-------------------
+
+**Bugfixes**
+
+- `requests.help` no longer fails on Python 2.6 due to the absence of
+ `ssl.OPENSSL_VERSION_NUMBER`.
+
+**Dependencies**
+
+- We now support urllib3 v1.22.
+
+2.18.1 (2017-06-14)
+-------------------
+
+**Bugfixes**
+
+- Fix an error in the packaging whereby the `*.whl` contained
+ incorrect data that regressed the fix in v2.17.3.
+
+2.18.0 (2017-06-14)
+-------------------
+
+**Improvements**
+
+- `Response` is now a context manager, so can be used directly in a
+ `with` statement without first having to be wrapped by
+ `contextlib.closing()`.
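The context-manager behaviour described above can be checked without any network access; the `with` usage shown in the comment is a hypothetical sketch:

```python
import requests

# Response gained __enter__/__exit__ in v2.18.0, so it can be used
# directly in a `with` statement without contextlib.closing().
assert hasattr(requests.Response, "__enter__")
assert hasattr(requests.Response, "__exit__")

# Typical usage (hypothetical URL); the connection is released back to
# the pool when the block exits:
#
# with requests.get("https://example.com", stream=True) as r:
#     body = r.content
```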
+
+**Bugfixes**
+
+- Resolve installation failure if multiprocessing is not available
+- Resolve tests crash if multiprocessing is not able to determine the
+ number of CPU cores
+- Resolve error swallowing in utils set\_environ generator
+
+2.17.3 (2017-05-29)
+-------------------
+
+**Improvements**
+
+- Improved `packages` namespace identity support, for monkeypatching
+ libraries.
+
+2.17.2 (2017-05-29)
+-------------------
+
+**Improvements**
+
+- Improved `packages` namespace identity support, for monkeypatching
+ libraries.
+
+2.17.1 (2017-05-29)
+-------------------
+
+**Improvements**
+
+- Improved `packages` namespace identity support, for monkeypatching
+ libraries.
+
+2.17.0 (2017-05-29)
+-------------------
+
+**Improvements**
+
+- Removal of the 301 redirect cache. This improves thread-safety.
+
+2.16.5 (2017-05-28)
+-------------------
+
+- Improvements to `$ python -m requests.help`.
+
+2.16.4 (2017-05-27)
+-------------------
+
+- Introduction of the `$ python -m requests.help` command, for
+ debugging with maintainers!
+
+2.16.3 (2017-05-27)
+-------------------
+
+- Further restored the `requests.packages` namespace for compatibility
+ reasons.
+
+2.16.2 (2017-05-27)
+-------------------
+
+- Further restored the `requests.packages` namespace for compatibility
+ reasons.
+
+No code modification (noted below) should be necessary any longer.
+
+2.16.1 (2017-05-27)
+-------------------
+
+- Restored the `requests.packages` namespace for compatibility
+ reasons.
+- Bugfix for `urllib3` version parsing.
+
+**Note**: code that was previously written to import against the
+`requests.packages` namespace will now have to import the code that
+rests at module level.
+
+For example:
+
+ from requests.packages.urllib3.poolmanager import PoolManager
+
+Will need to be re-written to be:
+
+ from requests.packages import urllib3
+ urllib3.poolmanager.PoolManager
+
+Or, even better:
+
+ from urllib3.poolmanager import PoolManager
+
+2.16.0 (2017-05-26)
+-------------------
+
+- Unvendor ALL the things!
+
+2.15.1 (2017-05-26)
+-------------------
+
+- Everyone makes mistakes.
+
+2.15.0 (2017-05-26)
+-------------------
+
+**Improvements**
+
+- Introduction of the `Response.next` property, for getting the next
+  `PreparedRequest` from a redirect chain (when
+  `allow_redirects=False`).
+- Internal refactoring of `__version__` module.
+
+**Bugfixes**
+
+- Restored once-optional parameter for
+ `requests.utils.get_environ_proxies()`.
+
+2.14.2 (2017-05-10)
+-------------------
+
+**Bugfixes**
+
+- Changed a less-than to an equal-to and an or in the dependency
+ markers to widen compatibility with older setuptools releases.
+
+2.14.1 (2017-05-09)
+-------------------
+
+**Bugfixes**
+
+- Changed the dependency markers to widen compatibility with older pip
+ releases.
+
+2.14.0 (2017-05-09)
+-------------------
+
+**Improvements**
+
+- It is now possible to pass `no_proxy` as a key to the `proxies`
+ dictionary to provide handling similar to the `NO_PROXY` environment
+ variable.
+- When users provide invalid paths to certificate bundle files or
+ directories Requests now raises `IOError`, rather than failing at
+ the time of the HTTPS request with a fairly inscrutable certificate
+ validation error.
+- The behavior of `SessionRedirectMixin` was slightly altered.
+ `resolve_redirects` will now detect a redirect by calling
+ `get_redirect_target(response)` instead of directly querying
+ `Response.is_redirect` and `Response.headers['location']`. Advanced
+ users will be able to process malformed redirects more easily.
+- Changed the internal calculation of elapsed request time to have
+ higher resolution on Windows.
+- Added `win_inet_pton` as conditional dependency for the `[socks]`
+ extra on Windows with Python 2.7.
+- Changed the proxy bypass implementation on Windows: the proxy bypass
+  check no longer uses forward and reverse DNS requests.
+- URLs with schemes that begin with `http` but are not `http` or
+ `https` no longer have their host parts forced to lowercase.
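A minimal sketch of the `no_proxy` entry described in the first bullet above; the proxy addresses are hypothetical and nothing is sent over the network:

```python
import requests

# Since v2.14.0 the `proxies` mapping accepts a `no_proxy` key that
# mirrors the NO_PROXY environment variable (hypothetical addresses).
session = requests.Session()
session.proxies.update({
    "http": "http://10.10.1.10:3128",
    "no_proxy": "localhost,127.0.0.1",
})
assert session.proxies["no_proxy"] == "localhost,127.0.0.1"
```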
+
+**Bugfixes**
+
+- Much improved handling of non-ASCII `Location` header values in
+ redirects. Fewer `UnicodeDecodeErrors` are encountered on Python 2,
+ and Python 3 now correctly understands that Latin-1 is unlikely to
+ be the correct encoding.
+- If an attempt to `seek` file to find out its length fails, we now
+ appropriately handle that by aborting our content-length
+ calculations.
+- Restricted `HTTPDigestAuth` to only respond to auth challenges made
+ on 4XX responses, rather than to all auth challenges.
+- Fixed some code that was firing `DeprecationWarning` on Python 3.6.
+- The dismayed person emoticon (`/o\\`) no longer has a big head. I'm
+ sure this is what you were all worrying about most.
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to v1.21.1.
+- Updated bundled chardet to v3.0.2.
+- Updated bundled idna to v2.5.
+- Updated bundled certifi to 2017.4.17.
+
+2.13.0 (2017-01-24)
+-------------------
+
+**Features**
+
+- Only load the `idna` library when we've determined we need it. This
+ will save some memory for users.
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to 1.20.
+- Updated bundled idna to 2.2.
+
+2.12.5 (2017-01-18)
+-------------------
+
+**Bugfixes**
+
+- Fixed an issue with JSON encoding detection, specifically detecting
+ big-endian UTF-32 with BOM.
+
+2.12.4 (2016-12-14)
+-------------------
+
+**Bugfixes**
+
+- Fixed regression from 2.12.2 where non-string types were rejected in
+ the basic auth parameters. While support for this behaviour has been
+ readded, the behaviour is deprecated and will be removed in the
+ future.
+
+2.12.3 (2016-12-01)
+-------------------
+
+**Bugfixes**
+
+- Fixed regression from v2.12.1 for URLs with schemes that begin with
+ "http". These URLs have historically been processed as though they
+ were HTTP-schemed URLs, and so have had parameters added. This was
+ removed in v2.12.2 in an overzealous attempt to resolve problems
+ with IDNA-encoding those URLs. This change was reverted: the other
+ fixes for IDNA-encoding have been judged to be sufficient to return
+ to the behaviour Requests had before v2.12.0.
+
+2.12.2 (2016-11-30)
+-------------------
+
+**Bugfixes**
+
+- Fixed several issues with IDNA-encoding URLs that are technically
+ invalid but which are widely accepted. Requests will now attempt to
+ IDNA-encode a URL if it can but, if it fails, and the host contains
+ only ASCII characters, it will be passed through optimistically.
+ This will allow users to opt-in to using IDNA2003 themselves if they
+ want to, and will also allow technically invalid but still common
+ hostnames.
+- Fixed an issue where URLs with leading whitespace would raise
+ `InvalidSchema` errors.
+- Fixed an issue where some URLs without the HTTP or HTTPS schemes
+ would still have HTTP URL preparation applied to them.
+- Fixed an issue where Unicode strings could not be used in basic
+ auth.
+- Fixed an issue encountered by some Requests plugins where
+ constructing a Response object would cause `Response.content` to
+ raise an `AttributeError`.
+
+2.12.1 (2016-11-16)
+-------------------
+
+**Bugfixes**
+
+- Updated setuptools 'security' extra for the new PyOpenSSL backend in
+ urllib3.
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to 1.19.1.
+
+2.12.0 (2016-11-15)
+-------------------
+
+**Improvements**
+
+- Updated support for internationalized domain names from IDNA2003 to
+ IDNA2008. This updated support is required for several forms of IDNs
+ and is mandatory for .de domains.
+- Much improved heuristics for guessing content lengths: Requests will
+ no longer read an entire `StringIO` into memory.
+- Much improved logic for recalculating `Content-Length` headers for
+ `PreparedRequest` objects.
+- Improved tolerance for file-like objects that have no `tell` method
+ but do have a `seek` method.
+- Anything that is a subclass of `Mapping` is now treated like a
+ dictionary by the `data=` keyword argument.
+- Requests now tolerates empty passwords in proxy credentials, rather
+ than stripping the credentials.
+- If a request is made with a file-like object as the body and that
+ request is redirected with a 307 or 308 status code, Requests will
+ now attempt to rewind the body object so it can be replayed.
+
+**Bugfixes**
+
+- When calling `response.close`, the call to `close` will be
+ propagated through to non-urllib3 backends.
+- Fixed issue where the `ALL_PROXY` environment variable would be
+ preferred over scheme-specific variables like `HTTP_PROXY`.
+- Fixed issue where non-UTF8 reason phrases got severely mangled by
+ falling back to decoding using ISO 8859-1 instead.
+- Fixed a bug where Requests would not correctly correlate cookies set
+ when using custom Host headers if those Host headers did not use the
+ native string type for the platform.
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to 1.19.
+- Updated bundled certifi certs to 2016.09.26.
+
+2.11.1 (2016-08-17)
+-------------------
+
+**Bugfixes**
+
+- Fixed a bug where using `iter_content` with `decode_unicode=True` on
+  streamed bodies would raise `AttributeError`. This bug was
+  introduced in 2.11.
+- Strip Content-Type and Transfer-Encoding headers from the header
+ block when following a redirect that transforms the verb from
+ POST/PUT to GET.
+
+2.11.0 (2016-08-08)
+-------------------
+
+**Improvements**
+
+- Added support for the `ALL_PROXY` environment variable.
+- Reject header values that contain leading whitespace or newline
+ characters to reduce risk of header smuggling.
+
+**Bugfixes**
+
+- Fixed occasional `TypeError` when attempting to decode a JSON
+  response that occurred in an error case. Now correctly raises a
+  `ValueError`.
+- Requests would incorrectly ignore a non-CIDR IP address in the
+  `NO_PROXY` environment variable: Requests now treats it as a
+  specific IP.
+- Fixed a bug when sending JSON data that could cause us to encounter
+ obscure OpenSSL errors in certain network conditions (yes, really).
+- Added type checks to ensure that `iter_content` only accepts
+ integers and `None` for chunk sizes.
+- Fixed issue where responses whose body had not been fully consumed
+ would have the underlying connection closed but not returned to the
+ connection pool, which could cause Requests to hang in situations
+ where the `HTTPAdapter` had been configured to use a blocking
+ connection pool.
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to 1.16.
+- Some previous releases accidentally accepted non-strings as
+ acceptable header values. This release does not.
+
+2.10.0 (2016-04-29)
+-------------------
+
+**New Features**
+
+- SOCKS Proxy Support! (requires PySocks;
+ `$ pip install requests[socks]`)
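A sketch of the SOCKS configuration introduced above (the proxy address is hypothetical; actually routing traffic additionally requires the PySocks extra installed via `pip install requests[socks]`):

```python
import requests

# SOCKS proxies are configured like HTTP proxies, using a socks5://
# scheme in the proxies mapping (hypothetical local proxy address).
session = requests.Session()
session.proxies = {
    "http": "socks5://127.0.0.1:1080",
    "https": "socks5://127.0.0.1:1080",
}
assert session.proxies["https"].startswith("socks5://")
```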
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to 1.15.1.
+
+2.9.2 (2016-04-29)
+------------------
+
+**Improvements**
+
+- Change built-in CaseInsensitiveDict (used for headers) to use
+ OrderedDict as its underlying datastore.
+
+**Bugfixes**
+
+- Don't use redirect\_cache if allow\_redirects=False
+- When passed objects that throw exceptions from `tell()`, send them
+ via chunked transfer encoding instead of failing.
+- Raise a ProxyError for proxy related connection issues.
+
+2.9.1 (2015-12-21)
+------------------
+
+**Bugfixes**
+
+- Resolve regression introduced in 2.9.0 that made it impossible to
+ send binary strings as bodies in Python 3.
+- Fixed errors when calculating cookie expiration dates in certain
+ locales.
+
+**Miscellaneous**
+
+- Updated bundled urllib3 to 1.13.1.
+
+2.9.0 (2015-12-15)
+------------------
+
+**Minor Improvements** (Backwards compatible)
+
+- The `verify` keyword argument now supports being passed a path to a
+ directory of CA certificates, not just a single-file bundle.
+- Warnings are now emitted when sending files opened in text mode.
+- Added the 511 Network Authentication Required status code to the
+ status code registry.
+
+**Bugfixes**
+
+- For file-like objects that are not seeked to the very beginning, we
+ now send the content length for the number of bytes we will actually
+ read, rather than the total size of the file, allowing partial file
+ uploads.
+- When uploading file-like objects, if they are empty or have no
+ obvious content length we set `Transfer-Encoding: chunked` rather
+ than `Content-Length: 0`.
+- We correctly receive the response in buffered mode when uploading
+ chunked bodies.
+- We now handle being passed a query string as a bytestring on Python
+ 3, by decoding it as UTF-8.
+- Sessions are now closed in all cases (exceptional and not) when
+ using the functional API rather than leaking and waiting for the
+ garbage collector to clean them up.
+- Correctly handle digest auth headers with a malformed `qop`
+ directive that contains no token, by treating it the same as if no
+ `qop` directive was provided at all.
+- Minor performance improvements when removing specific cookies by
+ name.
+
+**Miscellaneous**
+
+- Updated urllib3 to 1.13.
+
+2.8.1 (2015-10-13)
+------------------
+
+**Bugfixes**
+
+- Update certificate bundle to match `certifi` 2015.9.6.2's weak
+ certificate bundle.
+- Fix a bug in 2.8.0 where requests would raise `ConnectTimeout`
+ instead of `ConnectionError`
+- When using the PreparedRequest flow, requests will now correctly
+ respect the `json` parameter. Broken in 2.8.0.
+- When using the PreparedRequest flow, requests will now correctly
+ handle a Unicode-string method name on Python 2. Broken in 2.8.0.
+
+2.8.0 (2015-10-05)
+------------------
+
+**Minor Improvements** (Backwards Compatible)
+
+- Requests now supports per-host proxies. This allows the `proxies`
+  dictionary to have entries of the form
+  `{'scheme://hostname': proxy_url}`. Host-specific proxies will
+  be used in preference to the previously-supported scheme-specific
+  ones, but the previous syntax will continue to work.
+- `Response.raise_for_status` now prints the URL that failed as part
+ of the exception message.
+- `requests.utils.get_netrc_auth` now takes a `raise_errors` kwarg,
+  defaulting to `False`. When `True`, errors parsing `.netrc` files
+  cause exceptions to be thrown.
+- Change to bundled projects import logic to make it easier to
+ unbundle requests downstream.
+- Changed the default User-Agent string to avoid leaking data on
+ Linux: now contains only the requests version.
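The per-host proxy entry from the first bullet above can be sketched like this; the addresses are hypothetical, no traffic is sent, and `select_proxy` is the internal helper Requests uses for the lookup:

```python
import requests
from requests.utils import select_proxy

# v2.8.0: a 'scheme://hostname' key targets a single host and takes
# precedence over the plain 'scheme' fallback (hypothetical addresses).
session = requests.Session()
session.proxies = {
    "http": "http://generic.proxy:3128",
    "http://example.com": "http://special.proxy:3128",
}
assert select_proxy("http://example.com/path", session.proxies) == "http://special.proxy:3128"
assert select_proxy("http://other.org/", session.proxies) == "http://generic.proxy:3128"
```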
+
+**Bugfixes**
+
+- The `json` parameter to `post()` and friends will now only be used
+ if neither `data` nor `files` are present, consistent with the
+ documentation.
+- We now ignore empty fields in the `NO_PROXY` environment variable.
+- Fixed problem where `httplib.BadStatusLine` would get raised if
+ combining `stream=True` with `contextlib.closing`.
+- Prevented bugs where we would attempt to return the same connection
+ back to the connection pool twice when sending a Chunked body.
+- Miscellaneous minor internal changes.
+- Digest Auth support is now thread safe.
+
+**Updates**
+
+- Updated urllib3 to 1.12.
+
+2.7.0 (2015-05-03)
+------------------
+
+This is the first release that follows our new release process. For
+more, see [our
+documentation](http://docs.python-requests.org/en/latest/community/release-process/).
+
+**Bugfixes**
+
+- Updated urllib3 to 1.10.4, resolving several bugs involving chunked
+ transfer encoding and response framing.
+
+2.6.2 (2015-04-23)
+------------------
+
+**Bugfixes**
+
+- Fix regression where compressed data that was sent as chunked data
+ was not properly decompressed. (\#2561)
+
+2.6.1 (2015-04-22)
+------------------
+
+**Bugfixes**
+
+- Remove VendorAlias import machinery introduced in v2.5.2.
+- Simplify the PreparedRequest.prepare API: We no longer require the
+ user to pass an empty list to the hooks keyword argument. (c.f.
+ \#2552)
+- Resolve redirects now receives and forwards all of the original
+ arguments to the adapter. (\#2503)
+- Handle UnicodeDecodeErrors when trying to deal with a unicode URL
+ that cannot be encoded in ASCII. (\#2540)
+- Populate the parsed path of the URI field when performing Digest
+ Authentication. (\#2426)
+- Copy a PreparedRequest's CookieJar more reliably when it is not an
+ instance of RequestsCookieJar. (\#2527)
+
+2.6.0 (2015-03-14)
+------------------
+
+**Bugfixes**
+
+- CVE-2015-2296: Fix handling of cookies on redirect. Previously a
+ cookie without a host value set would use the hostname for the
+ redirected URL exposing requests users to session fixation attacks
+ and potentially cookie stealing. This was disclosed privately by
+ Matthew Daley of [BugFuzz](https://bugfuzz.com). This affects all
+ versions of requests from v2.1.0 to v2.5.3 (inclusive on both ends).
+- Fix error when requests is an `install_requires` dependency and
+ `python setup.py test` is run. (\#2462)
+- Fix error when urllib3 is unbundled and requests continues to use
+ the vendored import location.
+- Include fixes to `urllib3`'s header handling.
+- Requests' handling of unvendored dependencies is now more
+ restrictive.
+
+**Features and Improvements**
+
+- Support bytearrays when passed as parameters in the `files`
+ argument. (\#2468)
+- Avoid data duplication when creating a request with `str`, `bytes`,
+ or `bytearray` input to the `files` argument.
+
+2.5.3 (2015-02-24)
+------------------
+
+**Bugfixes**
+
+- Revert changes to our vendored certificate bundle. For more context
+  see \#2455 and \#2456.
+
+2.5.2 (2015-02-23)
+------------------
+
+**Features and Improvements**
+
+- Add sha256 fingerprint support.
+ ([shazow/urllib3\#540](https://github.com/shazow/urllib3/pull/540))
+- Improve the performance of headers.
+ ([shazow/urllib3\#544](https://github.com/shazow/urllib3/pull/544))
+
+**Bugfixes**
+
+- Copy pip's import machinery. When downstream redistributors remove
+ requests.packages.urllib3 the import machinery will continue to let
+ those same symbols work. Example usage in requests' documentation
+ and 3rd-party libraries relying on the vendored copies of urllib3
+ will work without having to fallback to the system urllib3.
+- Attempt to quote parts of the URL on redirect if unquoting and then
+ quoting fails. (\#2356)
+- Fix filename type check for multipart form-data uploads. (\#2411)
+- Properly handle the case where a server issuing digest
+ authentication challenges provides both auth and auth-int
+ qop-values. (\#2408)
+- Fix a socket leak.
+ ([shazow/urllib3\#549](https://github.com/shazow/urllib3/pull/549))
+- Fix multiple `Set-Cookie` headers properly.
+ ([shazow/urllib3\#534](https://github.com/shazow/urllib3/pull/534))
+- Disable the built-in hostname verification.
+ ([shazow/urllib3\#526](https://github.com/shazow/urllib3/pull/526))
+- Fix the behaviour of decoding an exhausted stream.
+ ([shazow/urllib3\#535](https://github.com/shazow/urllib3/pull/535))
+
+**Security**
+
+- Pulled in an updated `cacert.pem`.
+- Drop RC4 from the default cipher list.
+ ([shazow/urllib3\#551](https://github.com/shazow/urllib3/pull/551))
+
+2.5.1 (2014-12-23)
+------------------
+
+**Behavioural Changes**
+
+- Only catch HTTPErrors in raise\_for\_status (\#2382)
+
+**Bugfixes**
+
+- Handle LocationParseError from urllib3 (\#2344)
+- Handle file-like object filenames that are not strings (\#2379)
+- Unbreak HTTPDigestAuth handler. Allow new nonces to be negotiated
+ (\#2389)
+
+2.5.0 (2014-12-01)
+------------------
+
+**Improvements**
+
+- Allow usage of urllib3's Retry object with HTTPAdapters (\#2216)
+- The `iter_lines` method on a response now accepts a delimiter with
+ which to split the content (\#2295)
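The `delimiter` parameter above can be exercised without a live server by wrapping an in-memory body; this is a sketch that leans on urllib3 internals, not the usual way `Response` objects are constructed:

```python
import io

from requests.models import Response
from urllib3.response import HTTPResponse

# Build a Response around an in-memory body to show iter_lines()
# splitting on a custom delimiter (v2.5.0+).
resp = Response()
resp.raw = HTTPResponse(body=io.BytesIO(b"a|b|c"), preload_content=False)
parts = list(resp.iter_lines(delimiter=b"|"))
assert parts == [b"a", b"b", b"c"]
```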
+
+**Behavioural Changes**
+
+- Add deprecation warnings to functions in requests.utils that will be
+ removed in 3.0 (\#2309)
+- Sessions used by the functional API are always closed (\#2326)
+- Restrict requests to HTTP/1.1 and HTTP/1.0 (stop accepting HTTP/0.9)
+ (\#2323)
+
+**Bugfixes**
+
+- Only parse the URL once (\#2353)
+- Allow Content-Length header to always be overridden (\#2332)
+- Properly handle files in HTTPDigestAuth (\#2333)
+- Cap redirect\_cache size to prevent memory abuse (\#2299)
+- Fix HTTPDigestAuth handling of redirects after authenticating
+ successfully (\#2253)
+- Fix crash with custom method parameter to Session.request (\#2317)
+- Fix how Link headers are parsed using the regular expression library
+ (\#2271)
+
+**Documentation**
+
+- Add more references for interlinking (\#2348)
+- Update CSS for theme (\#2290)
+- Update width of buttons and sidebar (\#2289)
+- Replace references of Gittip with Gratipay (\#2282)
+- Add link to changelog in sidebar (\#2273)
+
+2.4.3 (2014-10-06)
+------------------
+
+**Bugfixes**
+
+- Unicode URL improvements for Python 2.
+- Re-order JSON param for backwards compat.
+- Automatically defrag authentication schemes from host/pass URIs.
+ ([\#2249](https://github.com/requests/requests/issues/2249))
+
+2.4.2 (2014-10-05)
+------------------
+
+**Improvements**
+
+- FINALLY! Add json parameter for uploads!
+ ([\#2258](https://github.com/requests/requests/pull/2258))
+- Support for bytestring URLs on Python 3.x
+ ([\#2238](https://github.com/requests/requests/pull/2238))
+
+**Bugfixes**
+
+- Avoid getting stuck in a loop
+ ([\#2244](https://github.com/requests/requests/pull/2244))
+- Multiple calls to iter\* fail with an unhelpful error.
+  ([\#2240](https://github.com/requests/requests/issues/2240),
+  [\#2241](https://github.com/requests/requests/issues/2241))
+
+**Documentation**
+
+- Correct redirection introduction
+ ([\#2245](https://github.com/requests/requests/pull/2245/))
+- Added example of how to send multiple files in one request.
+ ([\#2227](https://github.com/requests/requests/pull/2227/))
+- Clarify how to pass a custom set of CAs
+ ([\#2248](https://github.com/requests/requests/pull/2248/))
+
+2.4.1 (2014-09-09)
+------------------
+
+- Now has a "security" package extras set,
+ `$ pip install requests[security]`
+- Requests will now use Certifi if it is available.
+- Capture and re-raise urllib3 ProtocolError
+- Bugfix for responses that attempt to redirect to themselves forever
+ (wtf?).
+
+2.4.0 (2014-08-29)
+------------------
+
+**Behavioral Changes**
+
+- `Connection: keep-alive` header is now sent automatically.
+
+**Improvements**
+
+- Support for connect timeouts! Timeout now accepts a tuple (connect,
+ read) which is used to set individual connect and read timeouts.
+- Allow copying of PreparedRequests without headers/cookies.
+- Updated bundled urllib3 version.
+- Refactored settings loading from environment -- new
+ Session.merge\_environment\_settings.
+- Handle socket errors in iter\_content.
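The connect/read timeout tuple from the first bullet above maps onto urllib3's `Timeout`; this is a sketch of that internal conversion with illustrative values:

```python
from urllib3.util import Timeout

# v2.4.0: `timeout` accepts a (connect, read) pair; a single number
# still applies to both phases. Values here are illustrative.
connect, read = 3.05, 27
t = Timeout(connect=connect, read=read)
assert t.connect_timeout == 3.05
assert t.read_timeout == 27

# Hypothetical call using the same pair:
# requests.get("https://example.com", timeout=(connect, read))
```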
+
+2.3.0 (2014-05-16)
+------------------
+
+**API Changes**
+
+- New `Response` property `is_redirect`, which is true when the
+ library could have processed this response as a redirection (whether
+ or not it actually did).
+- The `timeout` parameter now affects requests with both `stream=True`
+ and `stream=False` equally.
+- The change in v2.0.0 to mandate explicit proxy schemes has been
+ reverted. Proxy schemes now default to `http://`.
+- The `CaseInsensitiveDict` used for HTTP headers now behaves like a
+  normal dictionary when referenced as a string or viewed in the
+  interpreter.
+
+**Bugfixes**
+
+- No longer expose Authorization or Proxy-Authorization headers on
+ redirect. Fix CVE-2014-1829 and CVE-2014-1830 respectively.
+- Authorization is re-evaluated each redirect.
+- On redirect, pass url as native strings.
+- Fall-back to autodetected encoding for JSON when Unicode detection
+ fails.
+- Headers set to `None` on the `Session` are now correctly not sent.
+- Correctly honor `decode_unicode` even if it wasn't used earlier in
+ the same response.
+- Stop advertising `compress` as a supported Content-Encoding.
+- The `Response.history` parameter is now always a list.
+- Many, many `urllib3` bugfixes.
+
+2.2.1 (2014-01-23)
+------------------
+
+**Bugfixes**
+
+- Fixes incorrect parsing of proxy credentials that contain a literal
+ or encoded '\#' character.
+- Assorted urllib3 fixes.
+
+2.2.0 (2014-01-09)
+------------------
+
+**API Changes**
+
+- New exception: `ContentDecodingError`. Raised instead of `urllib3`
+ `DecodeError` exceptions.
+
+**Bugfixes**
+
+- Avoid many many exceptions from the buggy implementation of
+ `proxy_bypass` on OS X in Python 2.6.
+- Avoid crashing when attempting to get authentication credentials
+ from \~/.netrc when running as a user without a home directory.
+- Use the correct pool size for pools of connections to proxies.
+- Fix iteration of `CookieJar` objects.
+- Ensure that cookies are persisted over redirect.
+- Switch back to using chardet, since it has merged with charade.
+
+2.1.0 (2013-12-05)
+------------------
+
+- Updated CA Bundle, of course.
+- Cookies set on individual Requests through a `Session` (e.g. via
+ `Session.get()`) are no longer persisted to the `Session`.
+- Clean up connections when we hit problems during chunked upload,
+ rather than leaking them.
+- Return connections to the pool when a chunked upload is successful,
+  rather than leaking them.
+- Match the HTTPbis recommendation for HTTP 301 redirects.
+- Prevent hanging when using streaming uploads and Digest Auth when a
+ 401 is received.
+- Values of headers set by Requests are now always the native string
+ type.
+- Fix previously broken SNI support.
+- Fix accessing HTTP proxies using proxy authentication.
+- Unencode HTTP Basic usernames and passwords extracted from URLs.
+- Support for IP address ranges for no\_proxy environment variable
+- Parse headers correctly when users override the default `Host:`
+ header.
+- Avoid munging the URL in case of case-sensitive servers.
+- Looser URL handling for non-HTTP/HTTPS urls.
+- Accept unicode methods in Python 2.6 and 2.7.
+- More resilient cookie handling.
+- Make `Response` objects pickleable.
+- Actually added MD5-sess to Digest Auth instead of pretending to like
+ last time.
+- Updated internal urllib3.
+- Fixed @Lukasa's lack of taste.
+
+2.0.1 (2013-10-24)
+------------------
+
+- Updated included CA Bundle with new mistrusts and automated process
+ for the future
+- Added MD5-sess to Digest Auth
+- Accept per-file headers in multipart file POST messages.
+- Fixed: Don't send the full URL on CONNECT messages.
+- Fixed: Correctly lowercase a redirect scheme.
+- Fixed: Cookies not persisted when set via functional API.
+- Fixed: Translate urllib3 ProxyError into a requests ProxyError
+ derived from ConnectionError.
+- Updated internal urllib3 and chardet.
+
+2.0.0 (2013-09-24)
+------------------
+
+**API Changes:**
+
+- Keys in the Headers dictionary are now native strings on all Python
+ versions, i.e. bytestrings on Python 2, unicode on Python 3.
+- Proxy URLs now *must* have an explicit scheme. A `MissingSchema`
+ exception will be raised if they don't.
+- Timeouts now apply to read time if `stream=False`.
+- `RequestException` is now a subclass of `IOError`, not
+ `RuntimeError`.
+- Added new method to `PreparedRequest` objects:
+ `PreparedRequest.copy()`.
+- Added new method to `Session` objects: `Session.update_request()`.
+ This method updates a `Request` object with the data (e.g. cookies)
+ stored on the `Session`.
+- Added new method to `Session` objects: `Session.prepare_request()`.
+ This method updates and prepares a `Request` object, and returns the
+ corresponding `PreparedRequest` object.
+- Added new method to `HTTPAdapter` objects:
+ `HTTPAdapter.proxy_headers()`. This should not be called directly,
+ but improves the subclass interface.
+- `httplib.IncompleteRead` exceptions caused by incorrect chunked
+ encoding will now raise a Requests `ChunkedEncodingError` instead.
+- Invalid percent-escape sequences now cause a Requests `InvalidURL`
+ exception to be raised.
+- HTTP 208 no longer uses reason phrase `"im_used"`. Correctly uses
+ `"already_reported"`.
+- HTTP 226 reason added (`"im_used"`).
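+
+The new `Session.prepare_request()` method can be sketched as follows; the
+URL is a placeholder, and no network traffic occurs because preparation only
+builds the `PreparedRequest`:
+
+```python
+import requests
+
+s = requests.Session()
+req = requests.Request("GET", "https://example.com", params={"q": "demo"})
+# prepare_request merges Session state (cookies, headers) into the request.
+prepared = s.prepare_request(req)
+print(prepared.method, prepared.url)
+```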
+
+**Bugfixes:**
+
+- Vastly improved proxy support, including the CONNECT verb. Special
+ thanks to the many contributors who worked towards this improvement.
+- Cookies are now properly managed when 401 authentication responses
+ are received.
+- Chunked encoding fixes.
+- Support for mixed case schemes.
+- Better handling of streaming downloads.
+- Retrieve environment proxies from more locations.
+- Minor cookies fixes.
+- Improved redirect behaviour.
+- Improved streaming behaviour, particularly for compressed data.
+- Miscellaneous small Python 3 text encoding bugs.
+- `.netrc` no longer overrides explicit auth.
+- Cookies set by hooks are now correctly persisted on Sessions.
+- Fix problem with cookies that specify port numbers in their host
+ field.
+- `BytesIO` can be used to perform streaming uploads.
+- More generous parsing of the `no_proxy` environment variable.
+- Non-string objects can be passed in data values alongside files.
+
+1.2.3 (2013-05-25)
+------------------
+
+- Simple packaging fix
+
+1.2.2 (2013-05-23)
+------------------
+
+- Simple packaging fix
+
+1.2.1 (2013-05-20)
+------------------
+
+- 301 and 302 redirects now change the verb to GET for all verbs, not
+ just POST, improving browser compatibility.
+- Python 3.3.2 compatibility
+- Always percent-encode location headers
+- Fix connection adapter matching to be most-specific first
+- new argument to the default connection adapter for passing a block
+ argument
+- prevent a KeyError when there's no link headers
+
+1.2.0 (2013-03-31)
+------------------
+
+- Fixed cookies on sessions and on requests
+- Significantly change how hooks are dispatched - hooks now receive
+ all the arguments specified by the user when making a request so
+ hooks can make a secondary request with the same parameters. This is
+ especially necessary for authentication handler authors
+- certifi support was removed
+- Fixed bug where using OAuth 1 with body `signature_type` sent no
+ data
+- Major proxy work thanks to @Lukasa including parsing of proxy
+ authentication from the proxy url
+- Fix DigestAuth handling too many 401s
+- Update vendored urllib3 to include SSL bug fixes
+- Allow keyword arguments to be passed to `json.loads()` via the
+ `Response.json()` method
+- Don't send `Content-Length` header by default on `GET` or `HEAD`
+ requests
+- Add `elapsed` attribute to `Response` objects to time how long a
+ request took.
+- Fix `RequestsCookieJar`
+- Sessions and Adapters are now picklable, i.e., can be used with the
+ multiprocessing library
+- Update charade to version 1.0.3
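+
+The pickling support mentioned above can be sketched like this (the header
+name is made up purely for illustration):
+
+```python
+import pickle
+
+import requests
+
+s = requests.Session()
+s.headers["X-Demo"] = "illustrative"  # hypothetical header name
+restored = pickle.loads(pickle.dumps(s))
+assert restored.headers["X-Demo"] == "illustrative"
+```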
+
+The change in how hooks are dispatched will likely cause a great deal of
+issues.
+
+1.1.0 (2013-01-10)
+------------------
+
+- CHUNKED REQUESTS
+- Support for iterable response bodies
+- Assume servers persist redirect params
+- Allow explicit content types to be specified for file data
+- Make merge\_kwargs case-insensitive when looking up keys
+
+1.0.3 (2012-12-18)
+------------------
+
+- Fix file upload encoding bug
+- Fix cookie behavior
+
+1.0.2 (2012-12-17)
+------------------
+
+- Proxy fix for HTTPAdapter.
+
+1.0.1 (2012-12-17)
+------------------
+
+- Cert verification exception bug.
+- Proxy fix for HTTPAdapter.
+
+1.0.0 (2012-12-17)
+------------------
+
+- Massive Refactor and Simplification
+- Switch to Apache 2.0 license
+- Swappable Connection Adapters
+- Mountable Connection Adapters
+- Mutable PreparedRequest chain
+- /s/prefetch/stream
+- Removal of all configuration
+- Standard library logging
+- Make Response.json() callable, not property.
+- Usage of new charade project, which provides chardet support for
+  Python 2 and 3 simultaneously.
+- Removal of all hooks except 'response'
+- Removal of all authentication helpers (OAuth, Kerberos)
+
+This is not a backwards compatible change.
+
+0.14.2 (2012-10-27)
+-------------------
+
+- Improved mime-compatible JSON handling
+- Proxy fixes
+- Path hack fixes
+- Case-Insensitive Content-Encoding headers
+- Support for CJK parameters in form posts
+
+0.14.1 (2012-10-01)
+-------------------
+
+- Python 3.3 Compatibility
+- Simplify default accept-encoding
+- Bugfixes
+
+0.14.0 (2012-09-02)
+-------------------
+
+- No more iter\_content errors if already downloaded.
+
+0.13.9 (2012-08-25)
+-------------------
+
+- Fix for OAuth + POSTs
+- Remove exception eating from dispatch\_hook
+- General bugfixes
+
+0.13.8 (2012-08-21)
+-------------------
+
+- Incredible Link header support :)
+
+0.13.7 (2012-08-19)
+-------------------
+
+- Support for (key, value) lists everywhere.
+- Digest Authentication improvements.
+- Ensure proxy exclusions work properly.
+- Clearer UnicodeError exceptions.
+- Automatic casting of URLs to strings (fURL and such)
+- Bugfixes.
+
+0.13.6 (2012-08-06)
+-------------------
+
+- Long awaited fix for hanging connections!
+
+0.13.5 (2012-07-27)
+-------------------
+
+- Packaging fix
+
+0.13.4 (2012-07-27)
+-------------------
+
+- GSSAPI/Kerberos authentication!
+- App Engine 2.7 Fixes!
+- Fix leaking connections (from urllib3 update)
+- OAuthlib path hack fix
+- OAuthlib URL parameters fix.
+
+0.13.3 (2012-07-12)
+-------------------
+
+- Use simplejson if available.
+- Do not hide SSLErrors behind Timeouts.
+- Fixed param handling with urls containing fragments.
+- Significantly improved information in User Agent.
+- client certificates are ignored when verify=False
+
+0.13.2 (2012-06-28)
+-------------------
+
+- Zero dependencies (once again)!
+- New: Response.reason
+- Sign querystring parameters in OAuth 1.0
+- Client certificates no longer ignored when verify=False
+- Add openSUSE certificate support
+
+0.13.1 (2012-06-07)
+-------------------
+
+- Allow passing a file or file-like object as data.
+- Allow hooks to return responses that indicate errors.
+- Fix Response.text and Response.json for body-less responses.
+
+0.13.0 (2012-05-29)
+-------------------
+
+- Removal of Requests.async in favor of
+ [grequests](https://github.com/kennethreitz/grequests)
+- Allow disabling of cookie persistence.
+- New implementation of safe\_mode
+- cookies.get now supports default argument
+- Session cookies not saved when Session.request is called with
+ return\_response=False
+- Env: no\_proxy support.
+- RequestsCookieJar improvements.
+- Various bug fixes.
+
+0.12.1 (2012-05-08)
+-------------------
+
+- New `Response.json` property.
+- Ability to add string file uploads.
+- Fix out-of-range issue with iter\_lines.
+- Fix iter\_content default size.
+- Fix POST redirects containing files.
+
+0.12.0 (2012-05-02)
+-------------------
+
+- EXPERIMENTAL OAUTH SUPPORT!
+- Proper CookieJar-backed cookies interface with awesome dict-like
+ interface.
+- Speed fix for non-iterated content chunks.
+- Move `pre_request` to a more usable place.
+- New `pre_send` hook.
+- Lazily encode data, params, files.
+- Load system Certificate Bundle if `certifi` isn't available.
+- Cleanups, fixes.
+
+0.11.2 (2012-04-22)
+-------------------
+
+- Attempt to use the OS's certificate bundle if `certifi` isn't
+ available.
+- Infinite digest auth redirect fix.
+- Multi-part file upload improvements.
+- Fix decoding of invalid %encodings in URLs.
+- If there is no content in a response don't throw an error the second
+ time that content is attempted to be read.
+- Upload data on redirects.
+
+0.11.1 (2012-03-30)
+-------------------
+
+- POST redirects now break RFC to do what browsers do: Follow up with
+ a GET.
+- New `strict_mode` configuration to disable new redirect behavior.
+
+0.11.0 (2012-03-14)
+-------------------
+
+- Private SSL Certificate support
+- Remove select.poll from Gevent monkeypatching
+- Remove redundant generator for chunked transfer encoding
+- Fix: Response.ok raises Timeout Exception in safe\_mode
+
+0.10.8 (2012-03-09)
+-------------------
+
+- Generate chunked ValueError fix
+- Proxy configuration by environment variables
+- Simplification of iter\_lines.
+- New trust\_env configuration for disabling system/environment hints.
+- Suppress cookie errors.
+
+0.10.7 (2012-03-07)
+-------------------
+
+- encode\_uri = False
+
+0.10.6 (2012-02-25)
+-------------------
+
+- Allow '=' in cookies.
+
+0.10.5 (2012-02-25)
+-------------------
+
+- Response body with 0 content-length fix.
+- New async.imap.
+- Don't fail on netrc.
+
+0.10.4 (2012-02-20)
+-------------------
+
+- Honor netrc.
+
+0.10.3 (2012-02-20)
+-------------------
+
+- HEAD requests don't follow redirects anymore.
+- raise\_for\_status() doesn't raise for 3xx anymore.
+- Make Session objects picklable.
+- ValueError for invalid schema URLs.
+
+0.10.2 (2012-01-15)
+-------------------
+
+- Vastly improved URL quoting.
+- Additional allowed cookie key values.
+- Attempted fix for "Too many open files" Error
+- Replace unicode errors on first pass, no need for second pass.
+- Append '/' to bare-domain urls before query insertion.
+- Exceptions now inherit from RuntimeError.
+- Binary uploads + auth fix.
+- Bugfixes.
+
+0.10.1 (2012-01-23)
+-------------------
+
+- PYTHON 3 SUPPORT!
+- Dropped 2.5 Support. (*Backwards Incompatible*)
+
+0.10.0 (2012-01-21)
+-------------------
+
+- `Response.content` is now bytes-only. (*Backwards Incompatible*)
+- New `Response.text` is unicode-only.
+- If no `Response.encoding` is specified and `chardet` is available,
+ `Response.text` will guess an encoding.
+- Default to ISO-8859-1 (Western) encoding for "text" subtypes.
+- Removal of decode\_unicode. (*Backwards Incompatible*)
+- New multiple-hooks system.
+- New `Response.register_hook` for registering hooks within the
+ pipeline.
+- `Response.url` is now Unicode.
+
+0.9.3 (2012-01-18)
+------------------
+
+- SSL verify=False bugfix (apparent on windows machines).
+
+0.9.2 (2012-01-18)
+------------------
+
+- Asynchronous async.send method.
+- Support for proper chunk streams with boundaries.
+- session argument for Session classes.
+- Print entire hook tracebacks, not just exception instance.
+- Fix response.iter\_lines from pending next line.
+- Fix bug in HTTP-digest auth w/ URIs having query strings.
+- Fix in Event Hooks section.
+- Urllib3 update.
+
+0.9.1 (2012-01-06)
+------------------
+
+- danger\_mode for automatic Response.raise\_for\_status()
+- Response.iter\_lines refactor
+
+0.9.0 (2011-12-28)
+------------------
+
+- verify ssl is default.
+
+0.8.9 (2011-12-28)
+------------------
+
+- Packaging fix.
+
+0.8.8 (2011-12-28)
+------------------
+
+- SSL CERT VERIFICATION!
+- Release of Certifi: Mozilla's cert list.
+- New 'verify' argument for SSL requests.
+- Urllib3 update.
+
+0.8.7 (2011-12-24)
+------------------
+
+- iter\_lines last-line truncation fix
+- Force safe\_mode for async requests
+- Handle safe\_mode exceptions more consistently
+- Fix iteration on null responses in safe\_mode
+
+0.8.6 (2011-12-18)
+------------------
+
+- Socket timeout fixes.
+- Proxy Authorization support.
+
+0.8.5 (2011-12-14)
+------------------
+
+- Response.iter\_lines!
+
+0.8.4 (2011-12-11)
+------------------
+
+- Prefetch bugfix.
+- Added license to installed version.
+
+0.8.3 (2011-11-27)
+------------------
+
+- Converted auth system to use simpler callable objects.
+- New session parameter to API methods.
+- Display full URL while logging.
+
+0.8.2 (2011-11-19)
+------------------
+
+- New Unicode decoding system, based on overridable
+  Response.encoding.
+- Proper URL slash-quote handling.
+- Cookies with `[`, `]`, and `_` allowed.
+
+0.8.1 (2011-11-15)
+------------------
+
+- URL Request path fix
+- Proxy fix.
+- Timeouts fix.
+
+0.8.0 (2011-11-13)
+------------------
+
+- Keep-alive support!
+- Complete removal of Urllib2
+- Complete removal of Poster
+- Complete removal of CookieJars
+- New ConnectionError raising
+- Safe\_mode for error catching
+- prefetch parameter for request methods
+- OPTION method
+- Async pool size throttling
+- File uploads send real names
+- Vendored in urllib3
+
+0.7.6 (2011-11-07)
+------------------
+
+- Digest authentication bugfix (attach query data to path)
+
+0.7.5 (2011-11-04)
+------------------
+
+- Response.content = None if there was an invalid response.
+- Redirection auth handling.
+
+0.7.4 (2011-10-26)
+------------------
+
+- Session Hooks fix.
+
+0.7.3 (2011-10-23)
+------------------
+
+- Digest Auth fix.
+
+0.7.2 (2011-10-23)
+------------------
+
+- PATCH Fix.
+
+0.7.1 (2011-10-23)
+------------------
+
+- Move away from urllib2 authentication handling.
+- Fully Remove AuthManager, AuthObject, &c.
+- New tuple-based auth system with handler callbacks.
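+
+The tuple-based auth system introduced here is still the modern interface; a
+minimal sketch (placeholder URL and credentials, prepared locally with no
+network traffic):
+
+```python
+import requests
+
+req = requests.Request("GET", "https://example.com", auth=("user", "pass"))
+# A (username, password) tuple is turned into an HTTP Basic Authorization
+# header when the request is prepared.
+prepared = req.prepare()
+print(prepared.headers["Authorization"])
+```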
+
+0.7.0 (2011-10-22)
+------------------
+
+- Sessions are now the primary interface.
+- Deprecated InvalidMethodException.
+- PATCH fix.
+- New config system (no more global settings).
+
+0.6.6 (2011-10-19)
+------------------
+
+- Session parameter bugfix (params merging).
+
+0.6.5 (2011-10-18)
+------------------
+
+- Offline (fast) test suite.
+- Session dictionary argument merging.
+
+0.6.4 (2011-10-13)
+------------------
+
+- Automatic decoding of unicode, based on HTTP Headers.
+- New `decode_unicode` setting.
+- Removal of `r.read/close` methods.
+- New `r.raw` interface for advanced response usage.\*
+- Automatic expansion of parameterized headers.
+
+0.6.3 (2011-10-13)
+------------------
+
+- Beautiful `requests.async` module, for making async requests w/
+ gevent.
+
+0.6.2 (2011-10-09)
+------------------
+
+- GET/HEAD obeys allow\_redirects=False.
+
+0.6.1 (2011-08-20)
+------------------
+
+- Enhanced status codes experience `\o/`
+- Set a maximum number of redirects (`settings.max_redirects`)
+- Full Unicode URL support
+- Support for protocol-less redirects.
+- Allow for arbitrary request types.
+- Bugfixes
+
+0.6.0 (2011-08-17)
+------------------
+
+- New callback hook system
+- New persistent sessions object and context manager
+- Transparent Dict-cookie handling
+- Status code reference object
+- Removed Response.cached
+- Added Response.request
+- All args are kwargs
+- Relative redirect support
+- HTTPError handling improvements
+- Improved https testing
+- Bugfixes
+
+0.5.1 (2011-07-23)
+------------------
+
+- International Domain Name Support!
+- Access headers without fetching entire body (`read()`)
+- Use lists as dicts for parameters
+- Add Forced Basic Authentication
+- Forced Basic is default authentication type
+- `python-requests.org` default User-Agent header
+- CaseInsensitiveDict lower-case caching
+- Response.history bugfix
+
+0.5.0 (2011-06-21)
+------------------
+
+- PATCH Support
+- Support for Proxies
+- HTTPBin Test Suite
+- Redirect Fixes
+- settings.verbose stream writing
+- Querystrings for all methods
+- URLErrors (Connection Refused, Timeout, Invalid URLs) are treated as
+ explicitly raised
+ `r.requests.get('hwe://blah'); r.raise_for_status()`
+
+0.4.1 (2011-05-22)
+------------------
+
+- Improved Redirection Handling
+- New 'allow\_redirects' param for following non-GET/HEAD Redirects
+- Settings module refactoring
+
+0.4.0 (2011-05-15)
+------------------
+
+- Response.history: list of redirected responses
+- Case-Insensitive Header Dictionaries!
+- Unicode URLs
+
+0.3.4 (2011-05-14)
+------------------
+
+- Urllib2 HTTPAuthentication Recursion fix (Basic/Digest)
+- Internal Refactor
+- Bytes data upload Bugfix
+
+0.3.3 (2011-05-12)
+------------------
+
+- Request timeouts
+- Unicode url-encoded data
+- Settings context manager and module
+
+0.3.2 (2011-04-15)
+------------------
+
+- Automatic Decompression of GZip Encoded Content
+- AutoAuth Support for Tupled HTTP Auth
+
+0.3.1 (2011-04-01)
+------------------
+
+- Cookie Changes
+- Response.read()
+- Poster fix
+
+0.3.0 (2011-02-25)
+------------------
+
+- Automatic Authentication API Change
+- Smarter Query URL Parameterization
+- Allow file uploads and POST data together
+- New Authentication Manager System:
+    - Simpler Basic HTTP System
+    - Supports all built-in urllib2 Auths
+    - Allows for custom Auth Handlers
+
+0.2.4 (2011-02-19)
+------------------
+
+- Python 2.5 Support
+- PyPy-c v1.4 Support
+- Auto-Authentication tests
+- Improved Request object constructor
+
+0.2.3 (2011-02-15)
+------------------
+
+- New HTTP Handling Methods:
+    - `Response.__nonzero__` (false if bad HTTP Status)
+    - `Response.ok` (True if expected HTTP Status)
+    - `Response.error` (Logged HTTPError if bad HTTP Status)
+    - `Response.raise_for_status()` (Raises stored HTTPError)
+
+0.2.2 (2011-02-14)
+------------------
+
+- Still handles request in the event of an HTTPError. (Issue \#2)
+- Eventlet and Gevent Monkeypatch support.
+- Cookie Support (Issue \#1)
+
+0.2.1 (2011-02-14)
+------------------
+
+- Added file attribute to POST and PUT requests for multipart-encode
+ file uploads.
+- Added Request.url attribute for context and redirects
+
+0.2.0 (2011-02-14)
+------------------
+
+- Birth!
+
+0.0.1 (2011-02-13)
+------------------
+
+- Frustration
+- Conception
+
diff --git a/HISTORY.rst b/HISTORY.rst
deleted file mode 100644
index d634e1fdf8..0000000000
--- a/HISTORY.rst
+++ /dev/null
@@ -1,1527 +0,0 @@
-.. :changelog:
-
-Release History
----------------
-
-2.18.4 (2017-08-15)
-+++++++++++++++++++
-
-**Improvements**
-
-- Error messages for invalid headers now include the header name for easier debugging
-
-**Dependencies**
-
-- We now support idna v2.6.
-
-2.18.3 (2017-08-02)
-+++++++++++++++++++
-
-**Improvements**
-
-- Running ``$ python -m requests.help`` now includes the installed version of idna.
-
-**Bugfixes**
-
-- Fixed issue where Requests would raise ``ConnectionError`` instead of
- ``SSLError`` when encountering SSL problems when using urllib3 v1.22.
-
-2.18.2 (2017-07-25)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- ``requests.help`` no longer fails on Python 2.6 due to the absence of
- ``ssl.OPENSSL_VERSION_NUMBER``.
-
-**Dependencies**
-
-- We now support urllib3 v1.22.
-
-2.18.1 (2017-06-14)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Fix an error in the packaging whereby the *.whl contained incorrect data that
- regressed the fix in v2.17.3.
-
-2.18.0 (2017-06-14)
-+++++++++++++++++++
-
-**Improvements**
-
-- ``Response`` is now a context manager, so can be used directly in a ``with`` statement
- without first having to be wrapped by ``contextlib.closing()``.
-
-**Bugfixes**
-
-- Resolve installation failure if multiprocessing is not available
-- Resolve tests crash if multiprocessing is not able to determine the number of CPU cores
-- Resolve error swallowing in utils set_environ generator
-
-
-2.17.3 (2017-05-29)
-+++++++++++++++++++
-
-**Improvements**
-
-- Improved ``packages`` namespace identity support, for monkeypatching libraries.
-
-
-2.17.2 (2017-05-29)
-+++++++++++++++++++
-
-**Improvements**
-
-- Improved ``packages`` namespace identity support, for monkeypatching libraries.
-
-
-2.17.1 (2017-05-29)
-+++++++++++++++++++
-
-**Improvements**
-
-- Improved ``packages`` namespace identity support, for monkeypatching libraries.
-
-
-2.17.0 (2017-05-29)
-+++++++++++++++++++
-
-**Improvements**
-
-- Removal of the 301 redirect cache. This improves thread-safety.
-
-
-2.16.5 (2017-05-28)
-+++++++++++++++++++
-
-- Improvements to ``$ python -m requests.help``.
-
-2.16.4 (2017-05-27)
-+++++++++++++++++++
-
-- Introduction of the ``$ python -m requests.help`` command, for debugging with maintainers!
-
-2.16.3 (2017-05-27)
-+++++++++++++++++++
-
-- Further restored the ``requests.packages`` namespace for compatibility reasons.
-
-2.16.2 (2017-05-27)
-+++++++++++++++++++
-
-- Further restored the ``requests.packages`` namespace for compatibility reasons.
-
-No code modification (noted below) should be neccessary any longer.
-
-2.16.1 (2017-05-27)
-+++++++++++++++++++
-
-- Restored the ``requests.packages`` namespace for compatibility reasons.
-- Bugfix for ``urllib3`` version parsing.
-
-**Note**: code that was written to import against the ``requests.packages``
-namespace previously will have to import code that rests at this module-level
-now.
-
-For example::
-
- from requests.packages.urllib3.poolmanager import PoolManager
-
-Will need to be re-written to be::
-
- from requests.packages import urllib3
- urllib3.poolmanager.PoolManager
-
-Or, even better::
-
- from urllib3.poolmanager import PoolManager
-
-2.16.0 (2017-05-26)
-+++++++++++++++++++
-
-- Unvendor ALL the things!
-
-2.15.1 (2017-05-26)
-+++++++++++++++++++
-
-- Everyone makes mistakes.
-
-2.15.0 (2017-05-26)
-+++++++++++++++++++
-
-**Improvements**
-
-- Introduction of the ``Response.next`` property, for getting the next
- ``PreparedResponse`` from a redirect chain (when ``allow_redirects=False``).
-- Internal refactoring of ``__version__`` module.
-
-**Bugfixes**
-
-- Restored once-optional parameter for ``requests.utils.get_environ_proxies()``.
-
-2.14.2 (2017-05-10)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Changed a less-than to an equal-to and an or in the dependency markers to
- widen compatibility with older setuptools releases.
-
-2.14.1 (2017-05-09)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Changed the dependency markers to widen compatibility with older pip
- releases.
-
-2.14.0 (2017-05-09)
-+++++++++++++++++++
-
-**Improvements**
-
-- It is now possible to pass ``no_proxy`` as a key to the ``proxies``
- dictionary to provide handling similar to the ``NO_PROXY`` environment
- variable.
-- When users provide invalid paths to certificate bundle files or directories
- Requests now raises ``IOError``, rather than failing at the time of the HTTPS
- request with a fairly inscrutable certificate validation error.
-- The behavior of ``SessionRedirectMixin`` was slightly altered.
- ``resolve_redirects`` will now detect a redirect by calling
- ``get_redirect_target(response)`` instead of directly
- querying ``Response.is_redirect`` and ``Response.headers['location']``.
- Advanced users will be able to process malformed redirects more easily.
-- Changed the internal calculation of elapsed request time to have higher
- resolution on Windows.
-- Added ``win_inet_pton`` as conditional dependency for the ``[socks]`` extra
- on Windows with Python 2.7.
-- Changed the proxy bypass implementation on Windows: the proxy bypass
- check doesn't use forward and reverse DNS requests anymore
-- URLs with schemes that begin with ``http`` but are not ``http`` or ``https``
- no longer have their host parts forced to lowercase.
-
-**Bugfixes**
-
-- Much improved handling of non-ASCII ``Location`` header values in redirects.
- Fewer ``UnicodeDecodeErrors`` are encountered on Python 2, and Python 3 now
- correctly understands that Latin-1 is unlikely to be the correct encoding.
-- If an attempt to ``seek`` file to find out its length fails, we now
- appropriately handle that by aborting our content-length calculations.
-- Restricted ``HTTPDigestAuth`` to only respond to auth challenges made on 4XX
- responses, rather than to all auth challenges.
-- Fixed some code that was firing ``DeprecationWarning`` on Python 3.6.
-- The dismayed person emoticon (``/o\\``) no longer has a big head. I'm sure
- this is what you were all worrying about most.
-
-
-**Miscellaneous**
-
-- Updated bundled urllib3 to v1.21.1.
-- Updated bundled chardet to v3.0.2.
-- Updated bundled idna to v2.5.
-- Updated bundled certifi to 2017.4.17.
-
-2.13.0 (2017-01-24)
-+++++++++++++++++++
-
-**Features**
-
-- Only load the ``idna`` library when we've determined we need it. This will
- save some memory for users.
-
-**Miscellaneous**
-
-- Updated bundled urllib3 to 1.20.
-- Updated bundled idna to 2.2.
-
-2.12.5 (2017-01-18)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Fixed an issue with JSON encoding detection, specifically detecting
- big-endian UTF-32 with BOM.
-
-2.12.4 (2016-12-14)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Fixed regression from 2.12.2 where non-string types were rejected in the
- basic auth parameters. While support for this behaviour has been readded,
- the behaviour is deprecated and will be removed in the future.
-
-2.12.3 (2016-12-01)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Fixed regression from v2.12.1 for URLs with schemes that begin with "http".
- These URLs have historically been processed as though they were HTTP-schemed
- URLs, and so have had parameters added. This was removed in v2.12.2 in an
- overzealous attempt to resolve problems with IDNA-encoding those URLs. This
- change was reverted: the other fixes for IDNA-encoding have been judged to
- be sufficient to return to the behaviour Requests had before v2.12.0.
-
-2.12.2 (2016-11-30)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Fixed several issues with IDNA-encoding URLs that are technically invalid but
- which are widely accepted. Requests will now attempt to IDNA-encode a URL if
- it can but, if it fails, and the host contains only ASCII characters, it will
- be passed through optimistically. This will allow users to opt-in to using
- IDNA2003 themselves if they want to, and will also allow technically invalid
- but still common hostnames.
-- Fixed an issue where URLs with leading whitespace would raise
- ``InvalidSchema`` errors.
-- Fixed an issue where some URLs without the HTTP or HTTPS schemes would still
- have HTTP URL preparation applied to them.
-- Fixed an issue where Unicode strings could not be used in basic auth.
-- Fixed an issue encountered by some Requests plugins where constructing a
- Response object would cause ``Response.content`` to raise an
- ``AttributeError``.
-
-2.12.1 (2016-11-16)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Updated setuptools 'security' extra for the new PyOpenSSL backend in urllib3.
-
-**Miscellaneous**
-
-- Updated bundled urllib3 to 1.19.1.
-
-2.12.0 (2016-11-15)
-+++++++++++++++++++
-
-**Improvements**
-
-- Updated support for internationalized domain names from IDNA2003 to IDNA2008.
- This updated support is required for several forms of IDNs and is mandatory
- for .de domains.
-- Much improved heuristics for guessing content lengths: Requests will no
- longer read an entire ``StringIO`` into memory.
-- Much improved logic for recalculating ``Content-Length`` headers for
- ``PreparedRequest`` objects.
-- Improved tolerance for file-like objects that have no ``tell`` method but
- do have a ``seek`` method.
-- Anything that is a subclass of ``Mapping`` is now treated like a dictionary
- by the ``data=`` keyword argument.
-- Requests now tolerates empty passwords in proxy credentials, rather than
- stripping the credentials.
-- If a request is made with a file-like object as the body and that request is
- redirected with a 307 or 308 status code, Requests will now attempt to
- rewind the body object so it can be replayed.
-
-**Bugfixes**
-
-- When calling ``response.close``, the call to ``close`` will be propagated
- through to non-urllib3 backends.
-- Fixed issue where the ``ALL_PROXY`` environment variable would be preferred
- over scheme-specific variables like ``HTTP_PROXY``.
-- Fixed issue where non-UTF8 reason phrases got severely mangled by falling
- back to decoding using ISO 8859-1 instead.
-- Fixed a bug where Requests would not correctly correlate cookies set when
- using custom Host headers if those Host headers did not use the native
- string type for the platform.
-
-**Miscellaneous**
-
-- Updated bundled urllib3 to 1.19.
-- Updated bundled certifi certs to 2016.09.26.
-
-2.11.1 (2016-08-17)
-+++++++++++++++++++
-
-**Bugfixes**
-
-- Fixed a bug when using ``iter_content`` with ``decode_unicode=True`` for
- streamed bodies would raise ``AttributeError``. This bug was introduced in
- 2.11.
-- Strip Content-Type and Transfer-Encoding headers from the header block when
- following a redirect that transforms the verb from POST/PUT to GET.
-
-2.11.0 (2016-08-08)
-+++++++++++++++++++
-
-**Improvements**
-
-- Added support for the ``ALL_PROXY`` environment variable.
-- Reject header values that contain leading whitespace or newline characters to
- reduce risk of header smuggling.
-
-**Bugfixes**
-
-- Fixed occasional ``TypeError`` when attempting to decode a JSON response that
- occurred in an error case. Now correctly returns a ``ValueError``.
-- Requests would incorrectly ignore a non-CIDR IP address in the ``NO_PROXY``
- environment variables: Requests now treats it as a specific IP.
-- Fixed a bug when sending JSON data that could cause us to encounter obscure
- OpenSSL errors in certain network conditions (yes, really).
-- Added type checks to ensure that ``iter_content`` only accepts integers and
- ``None`` for chunk sizes.
-- Fixed issue where responses whose body had not been fully consumed would have
- the underlying connection closed but not returned to the connection pool,
- which could cause Requests to hang in situations where the ``HTTPAdapter``
- had been configured to use a blocking connection pool.
-
-**Miscellaneous**
-
-- Updated bundled urllib3 to 1.16.
-- Some previous releases accidentally accepted non-strings as acceptable header values. This release does not.
-
-2.10.0 (2016-04-29)
-+++++++++++++++++++
-
-**New Features**
-
-- SOCKS Proxy Support! (requires PySocks; ``$ pip install requests[socks]``)
-
-**Miscellaneous**
-
-- Updated bundled urllib3 to 1.15.1.
-
-2.9.2 (2016-04-29)
-++++++++++++++++++
-
-**Improvements**
-
-- Change built-in CaseInsensitiveDict (used for headers) to use OrderedDict
- as its underlying datastore.
-
-**Bugfixes**
-
-- Don't use redirect_cache if allow_redirects=False
-- When passed objects that throw exceptions from ``tell()``, send them via
- chunked transfer encoding instead of failing.
-- Raise a ProxyError for proxy related connection issues.
-
-2.9.1 (2015-12-21)
-++++++++++++++++++
-
-**Bugfixes**
-
-- Resolve regression introduced in 2.9.0 that made it impossible to send binary
- strings as bodies in Python 3.
-- Fixed errors when calculating cookie expiration dates in certain locales.
-
-**Miscellaneous**
-
-- Updated bundled urllib3 to 1.13.1.
-
-2.9.0 (2015-12-15)
-++++++++++++++++++
-
-**Minor Improvements** (Backwards compatible)
-
-- The ``verify`` keyword argument now supports being passed a path to a
- directory of CA certificates, not just a single-file bundle.
-- Warnings are now emitted when sending files opened in text mode.
-- Added the 511 Network Authentication Required status code to the status code
- registry.
-
-**Bugfixes**
-
-- For file-like objects that are not seeked to the very beginning, we now
- send the content length for the number of bytes we will actually read, rather
- than the total size of the file, allowing partial file uploads.
-- When uploading file-like objects, if they are empty or have no obvious
- content length we set ``Transfer-Encoding: chunked`` rather than
- ``Content-Length: 0``.
-- We correctly receive the response in buffered mode when uploading chunked
- bodies.
-- We now handle being passed a query string as a bytestring on Python 3, by
- decoding it as UTF-8.
-- Sessions are now closed in all cases (exceptional and not) when using the
- functional API rather than leaking and waiting for the garbage collector to
- clean them up.
-- Correctly handle digest auth headers with a malformed ``qop`` directive that
- contains no token, by treating it the same as if no ``qop`` directive was
- provided at all.
-- Minor performance improvements when removing specific cookies by name.
-
-**Miscellaneous**
-
-- Updated urllib3 to 1.13.
-
-2.8.1 (2015-10-13)
-++++++++++++++++++
-
-**Bugfixes**
-
-- Update certificate bundle to match ``certifi`` 2015.9.6.2's weak certificate
- bundle.
-- Fix a bug in 2.8.0 where requests would raise ``ConnectTimeout`` instead of
-  ``ConnectionError``.
-- When using the PreparedRequest flow, requests will now correctly respect the
- ``json`` parameter. Broken in 2.8.0.
-- When using the PreparedRequest flow, requests will now correctly handle a
- Unicode-string method name on Python 2. Broken in 2.8.0.
-
-2.8.0 (2015-10-05)
-++++++++++++++++++
-
-**Minor Improvements** (Backwards Compatible)
-
-- Requests now supports per-host proxies. This allows the ``proxies``
- dictionary to have entries of the form
-  ``{'<scheme>://<hostname>': '<proxy_url>'}``. Host-specific proxies will be used
- in preference to the previously-supported scheme-specific ones, but the
- previous syntax will continue to work.
-- ``Response.raise_for_status`` now prints the URL that failed as part of the
- exception message.
-- ``requests.utils.get_netrc_auth`` now takes a ``raise_errors`` kwarg,
- defaulting to ``False``. When ``True``, errors parsing ``.netrc`` files cause
- exceptions to be thrown.
-- Change to bundled projects import logic to make it easier to unbundle
- requests downstream.
-- Changed the default User-Agent string to avoid leaking data on Linux: now
- contains only the requests version.
-
-**Bugfixes**
-
-- The ``json`` parameter to ``post()`` and friends will now only be used if
- neither ``data`` nor ``files`` are present, consistent with the
- documentation.
-- We now ignore empty fields in the ``NO_PROXY`` environment variable.
-- Fixed problem where ``httplib.BadStatusLine`` would get raised if combining
- ``stream=True`` with ``contextlib.closing``.
-- Prevented bugs where we would attempt to return the same connection back to
- the connection pool twice when sending a Chunked body.
-- Miscellaneous minor internal changes.
-- Digest Auth support is now thread safe.
-
-**Updates**
-
-- Updated urllib3 to 1.12.
-
-2.7.0 (2015-05-03)
-++++++++++++++++++
-
-This is the first release that follows our new release process. For more, see
-`our documentation
-`_.
-
-**Bugfixes**
-
-- Updated urllib3 to 1.10.4, resolving several bugs involving chunked transfer
- encoding and response framing.
-
-2.6.2 (2015-04-23)
-++++++++++++++++++
-
-**Bugfixes**
-
-- Fix regression where compressed data that was sent as chunked data was not
- properly decompressed. (#2561)
-
-2.6.1 (2015-04-22)
-++++++++++++++++++
-
-**Bugfixes**
-
-- Remove VendorAlias import machinery introduced in v2.5.2.
-
-- Simplify the PreparedRequest.prepare API: We no longer require the user to
- pass an empty list to the hooks keyword argument. (c.f. #2552)
-
-- Resolve redirects now receives and forwards all of the original arguments to
- the adapter. (#2503)
-
-- Handle UnicodeDecodeErrors when trying to deal with a unicode URL that
- cannot be encoded in ASCII. (#2540)
-
-- Populate the parsed path of the URI field when performing Digest
- Authentication. (#2426)
-
-- Copy a PreparedRequest's CookieJar more reliably when it is not an instance
- of RequestsCookieJar. (#2527)
-
-2.6.0 (2015-03-14)
-++++++++++++++++++
-
-**Bugfixes**
-
-- CVE-2015-2296: Fix handling of cookies on redirect. Previously a cookie
- without a host value set would use the hostname for the redirected URL
- exposing requests users to session fixation attacks and potentially cookie
- stealing. This was disclosed privately by Matthew Daley of
- `BugFuzz `_. This affects all versions of requests from
- v2.1.0 to v2.5.3 (inclusive on both ends).
-
-- Fix error when requests is an ``install_requires`` dependency and ``python
- setup.py test`` is run. (#2462)
-
-- Fix error when urllib3 is unbundled and requests continues to use the
- vendored import location.
-
-- Include fixes to ``urllib3``'s header handling.
-
-- Requests' handling of unvendored dependencies is now more restrictive.
-
-**Features and Improvements**
-
-- Support bytearrays when passed as parameters in the ``files`` argument.
- (#2468)
-
-- Avoid data duplication when creating a request with ``str``, ``bytes``, or
- ``bytearray`` input to the ``files`` argument.
-
-2.5.3 (2015-02-24)
-++++++++++++++++++
-
-**Bugfixes**
-
-- Revert changes to our vendored certificate bundle. For more context see
- (#2455, #2456, and http://bugs.python.org/issue23476)
-
-2.5.2 (2015-02-23)
-++++++++++++++++++
-
-**Features and Improvements**
-
-- Add sha256 fingerprint support. (`shazow/urllib3#540`_)
-
-- Improve the performance of headers. (`shazow/urllib3#544`_)
-
-**Bugfixes**
-
-- Copy pip's import machinery. When downstream redistributors remove
- requests.packages.urllib3 the import machinery will continue to let those
- same symbols work. Example usage in requests' documentation and 3rd-party
- libraries relying on the vendored copies of urllib3 will work without having
- to fallback to the system urllib3.
-
-- Attempt to quote parts of the URL on redirect if unquoting and then quoting
- fails. (#2356)
-
-- Fix filename type check for multipart form-data uploads. (#2411)
-
-- Properly handle the case where a server issuing digest authentication
- challenges provides both auth and auth-int qop-values. (#2408)
-
-- Fix a socket leak. (`shazow/urllib3#549`_)
-
-- Fix multiple ``Set-Cookie`` headers properly. (`shazow/urllib3#534`_)
-
-- Disable the built-in hostname verification. (`shazow/urllib3#526`_)
-
-- Fix the behaviour of decoding an exhausted stream. (`shazow/urllib3#535`_)
-
-**Security**
-
-- Pulled in an updated ``cacert.pem``.
-
-- Drop RC4 from the default cipher list. (`shazow/urllib3#551`_)
-
-.. _shazow/urllib3#551: https://github.com/shazow/urllib3/pull/551
-.. _shazow/urllib3#549: https://github.com/shazow/urllib3/pull/549
-.. _shazow/urllib3#544: https://github.com/shazow/urllib3/pull/544
-.. _shazow/urllib3#540: https://github.com/shazow/urllib3/pull/540
-.. _shazow/urllib3#535: https://github.com/shazow/urllib3/pull/535
-.. _shazow/urllib3#534: https://github.com/shazow/urllib3/pull/534
-.. _shazow/urllib3#526: https://github.com/shazow/urllib3/pull/526
-
-2.5.1 (2014-12-23)
-++++++++++++++++++
-
-**Behavioural Changes**
-
-- Only catch HTTPErrors in raise_for_status (#2382)
-
-**Bugfixes**
-
-- Handle LocationParseError from urllib3 (#2344)
-- Handle file-like object filenames that are not strings (#2379)
-- Unbreak HTTPDigestAuth handler. Allow new nonces to be negotiated (#2389)
-
-2.5.0 (2014-12-01)
-++++++++++++++++++
-
-**Improvements**
-
-- Allow usage of urllib3's Retry object with HTTPAdapters (#2216)
-- The ``iter_lines`` method on a response now accepts a delimiter with which
- to split the content (#2295)
-
-**Behavioural Changes**
-
-- Add deprecation warnings to functions in requests.utils that will be removed
- in 3.0 (#2309)
-- Sessions used by the functional API are always closed (#2326)
-- Restrict requests to HTTP/1.1 and HTTP/1.0 (stop accepting HTTP/0.9) (#2323)
-
-**Bugfixes**
-
-- Only parse the URL once (#2353)
-- Allow Content-Length header to always be overridden (#2332)
-- Properly handle files in HTTPDigestAuth (#2333)
-- Cap redirect_cache size to prevent memory abuse (#2299)
-- Fix HTTPDigestAuth handling of redirects after authenticating successfully
- (#2253)
-- Fix crash with custom method parameter to Session.request (#2317)
-- Fix how Link headers are parsed using the regular expression library (#2271)
-
-**Documentation**
-
-- Add more references for interlinking (#2348)
-- Update CSS for theme (#2290)
-- Update width of buttons and sidebar (#2289)
-- Replace references of Gittip with Gratipay (#2282)
-- Add link to changelog in sidebar (#2273)
-
-2.4.3 (2014-10-06)
-++++++++++++++++++
-
-**Bugfixes**
-
-- Unicode URL improvements for Python 2.
-- Re-order JSON param for backwards compat.
-- Automatically defrag authentication schemes from host/pass URIs. (`#2249 `_)
-
-
-2.4.2 (2014-10-05)
-++++++++++++++++++
-
-**Improvements**
-
-- FINALLY! Add json parameter for uploads! (`#2258 `_)
-- Support for bytestring URLs on Python 3.x (`#2238 `_)
-
-**Bugfixes**
-
-- Avoid getting stuck in a loop (`#2244 `_)
-- Multiple calls to iter* fail with unhelpful error. (`#2240 `_, `#2241 `_)
-
-**Documentation**
-
-- Correct redirection introduction (`#2245 `_)
-- Added example of how to send multiple files in one request. (`#2227 `_)
-- Clarify how to pass a custom set of CAs (`#2248 `_)
-
-
-
-2.4.1 (2014-09-09)
-++++++++++++++++++
-
-- Now has a "security" package extras set, ``$ pip install requests[security]``
-- Requests will now use Certifi if it is available.
-- Capture and re-raise urllib3 ProtocolError
-- Bugfix for responses that attempt to redirect to themselves forever (wtf?).
-
-
-2.4.0 (2014-08-29)
-++++++++++++++++++
-
-**Behavioral Changes**
-
-- ``Connection: keep-alive`` header is now sent automatically.
-
-**Improvements**
-
-- Support for connect timeouts! Timeout now accepts a tuple (connect, read) which is used to set individual connect and read timeouts.
-- Allow copying of PreparedRequests without headers/cookies.
-- Updated bundled urllib3 version.
-- Refactored settings loading from environment -- new `Session.merge_environment_settings`.
-- Handle socket errors in iter_content.
-
-
-2.3.0 (2014-05-16)
-++++++++++++++++++
-
-**API Changes**
-
-- New ``Response`` property ``is_redirect``, which is true when the
- library could have processed this response as a redirection (whether
- or not it actually did).
-- The ``timeout`` parameter now affects requests with both ``stream=True`` and
- ``stream=False`` equally.
-- The change in v2.0.0 to mandate explicit proxy schemes has been reverted.
- Proxy schemes now default to ``http://``.
-- The ``CaseInsensitiveDict`` used for HTTP headers now behaves like a normal
-  dictionary when referenced as string or viewed in the interpreter.
-
-**Bugfixes**
-
-- No longer expose Authorization or Proxy-Authorization headers on redirect.
- Fix CVE-2014-1829 and CVE-2014-1830 respectively.
-- Authorization is re-evaluated each redirect.
-- On redirect, pass url as native strings.
-- Fall-back to autodetected encoding for JSON when Unicode detection fails.
-- Headers set to ``None`` on the ``Session`` are now correctly not sent.
-- Correctly honor ``decode_unicode`` even if it wasn't used earlier in the same
- response.
-- Stop advertising ``compress`` as a supported Content-Encoding.
-- The ``Response.history`` parameter is now always a list.
-- Many, many ``urllib3`` bugfixes.
-
-2.2.1 (2014-01-23)
-++++++++++++++++++
-
-**Bugfixes**
-
-- Fixes incorrect parsing of proxy credentials that contain a literal or encoded '#' character.
-- Assorted urllib3 fixes.
-
-2.2.0 (2014-01-09)
-++++++++++++++++++
-
-**API Changes**
-
-- New exception: ``ContentDecodingError``. Raised instead of ``urllib3``
- ``DecodeError`` exceptions.
-
-**Bugfixes**
-
-- Avoid many many exceptions from the buggy implementation of ``proxy_bypass`` on OS X in Python 2.6.
-- Avoid crashing when attempting to get authentication credentials from ~/.netrc when running as a user without a home directory.
-- Use the correct pool size for pools of connections to proxies.
-- Fix iteration of ``CookieJar`` objects.
-- Ensure that cookies are persisted over redirect.
-- Switch back to using chardet, since it has merged with charade.
-
-2.1.0 (2013-12-05)
-++++++++++++++++++
-
-- Updated CA Bundle, of course.
-- Cookies set on individual Requests through a ``Session`` (e.g. via ``Session.get()``) are no longer persisted to the ``Session``.
-- Clean up connections when we hit problems during chunked upload, rather than leaking them.
-- Return connections to the pool when a chunked upload is successful, rather than leaking it.
-- Match the HTTPbis recommendation for HTTP 301 redirects.
-- Prevent hanging when using streaming uploads and Digest Auth when a 401 is received.
-- Values of headers set by Requests are now always the native string type.
-- Fix previously broken SNI support.
-- Fix accessing HTTP proxies using proxy authentication.
-- Unencode HTTP Basic usernames and passwords extracted from URLs.
-- Support for IP address ranges for no_proxy environment variable
-- Parse headers correctly when users override the default ``Host:`` header.
-- Avoid munging the URL in case of case-sensitive servers.
-- Looser URL handling for non-HTTP/HTTPS urls.
-- Accept unicode methods in Python 2.6 and 2.7.
-- More resilient cookie handling.
-- Make ``Response`` objects pickleable.
-- Actually added MD5-sess to Digest Auth instead of pretending to like last time.
-- Updated internal urllib3.
-- Fixed @Lukasa's lack of taste.
-
-2.0.1 (2013-10-24)
-++++++++++++++++++
-
-- Updated included CA Bundle with new mistrusts and automated process for the future
-- Added MD5-sess to Digest Auth
-- Accept per-file headers in multipart file POST messages.
-- Fixed: Don't send the full URL on CONNECT messages.
-- Fixed: Correctly lowercase a redirect scheme.
-- Fixed: Cookies not persisted when set via functional API.
-- Fixed: Translate urllib3 ProxyError into a requests ProxyError derived from ConnectionError.
-- Updated internal urllib3 and chardet.
-
-2.0.0 (2013-09-24)
-++++++++++++++++++
-
-**API Changes:**
-
-- Keys in the Headers dictionary are now native strings on all Python versions,
- i.e. bytestrings on Python 2, unicode on Python 3.
-- Proxy URLs now *must* have an explicit scheme. A ``MissingSchema`` exception
- will be raised if they don't.
-- Timeouts now apply to read time if ``stream=False``.
-- ``RequestException`` is now a subclass of ``IOError``, not ``RuntimeError``.
-- Added new method to ``PreparedRequest`` objects: ``PreparedRequest.copy()``.
-- Added new method to ``Session`` objects: ``Session.update_request()``. This
- method updates a ``Request`` object with the data (e.g. cookies) stored on
- the ``Session``.
-- Added new method to ``Session`` objects: ``Session.prepare_request()``. This
- method updates and prepares a ``Request`` object, and returns the
- corresponding ``PreparedRequest`` object.
-- Added new method to ``HTTPAdapter`` objects: ``HTTPAdapter.proxy_headers()``.
- This should not be called directly, but improves the subclass interface.
-- ``httplib.IncompleteRead`` exceptions caused by incorrect chunked encoding
- will now raise a Requests ``ChunkedEncodingError`` instead.
-- Invalid percent-escape sequences now cause a Requests ``InvalidURL``
- exception to be raised.
-- HTTP 208 no longer uses reason phrase ``"im_used"``. Correctly uses
- ``"already_reported"``.
-- HTTP 226 reason added (``"im_used"``).
-
-**Bugfixes:**
-
-- Vastly improved proxy support, including the CONNECT verb. Special thanks to
- the many contributors who worked towards this improvement.
-- Cookies are now properly managed when 401 authentication responses are
- received.
-- Chunked encoding fixes.
-- Support for mixed case schemes.
-- Better handling of streaming downloads.
-- Retrieve environment proxies from more locations.
-- Minor cookies fixes.
-- Improved redirect behaviour.
-- Improved streaming behaviour, particularly for compressed data.
-- Miscellaneous small Python 3 text encoding bugs.
-- ``.netrc`` no longer overrides explicit auth.
-- Cookies set by hooks are now correctly persisted on Sessions.
-- Fix problem with cookies that specify port numbers in their host field.
-- ``BytesIO`` can be used to perform streaming uploads.
-- More generous parsing of the ``no_proxy`` environment variable.
-- Non-string objects can be passed in data values alongside files.
-
-1.2.3 (2013-05-25)
-++++++++++++++++++
-
-- Simple packaging fix
-
-
-1.2.2 (2013-05-23)
-++++++++++++++++++
-
-- Simple packaging fix
-
-
-1.2.1 (2013-05-20)
-++++++++++++++++++
-
-- 301 and 302 redirects now change the verb to GET for all verbs, not just
- POST, improving browser compatibility.
-- Python 3.3.2 compatibility
-- Always percent-encode location headers
-- Fix connection adapter matching to be most-specific first
-- new argument to the default connection adapter for passing a block argument
-- prevent a KeyError when there's no link headers
-
-1.2.0 (2013-03-31)
-++++++++++++++++++
-
-- Fixed cookies on sessions and on requests
-- Significantly change how hooks are dispatched - hooks now receive all the
- arguments specified by the user when making a request so hooks can make a
- secondary request with the same parameters. This is especially necessary for
- authentication handler authors
-- certifi support was removed
-- Fixed bug where using OAuth 1 with body ``signature_type`` sent no data
-- Major proxy work thanks to @Lukasa including parsing of proxy authentication
- from the proxy url
-- Fix DigestAuth handling too many 401s
-- Update vendored urllib3 to include SSL bug fixes
-- Allow keyword arguments to be passed to ``json.loads()`` via the
- ``Response.json()`` method
-- Don't send ``Content-Length`` header by default on ``GET`` or ``HEAD``
- requests
-- Add ``elapsed`` attribute to ``Response`` objects to time how long a request
- took.
-- Fix ``RequestsCookieJar``
-- Sessions and Adapters are now picklable, i.e., can be used with the
- multiprocessing library
-- Update charade to version 1.0.3
-
-The change in how hooks are dispatched will likely cause a great deal of
-issues.
-
-1.1.0 (2013-01-10)
-++++++++++++++++++
-
-- CHUNKED REQUESTS
-- Support for iterable response bodies
-- Assume servers persist redirect params
-- Allow explicit content types to be specified for file data
-- Make merge_kwargs case-insensitive when looking up keys
-
-1.0.3 (2012-12-18)
-++++++++++++++++++
-
-- Fix file upload encoding bug
-- Fix cookie behavior
-
-1.0.2 (2012-12-17)
-++++++++++++++++++
-
-- Proxy fix for HTTPAdapter.
-
-1.0.1 (2012-12-17)
-++++++++++++++++++
-
-- Cert verification exception bug.
-- Proxy fix for HTTPAdapter.
-
-1.0.0 (2012-12-17)
-++++++++++++++++++
-
-- Massive Refactor and Simplification
-- Switch to Apache 2.0 license
-- Swappable Connection Adapters
-- Mountable Connection Adapters
-- Mutable ProcessedRequest chain
-- /s/prefetch/stream
-- Removal of all configuration
-- Standard library logging
-- Make Response.json() callable, not property.
-- Usage of new charade project, which provides python 2 and 3 simultaneous chardet.
-- Removal of all hooks except 'response'
-- Removal of all authentication helpers (OAuth, Kerberos)
-
-This is not a backwards compatible change.
-
-0.14.2 (2012-10-27)
-+++++++++++++++++++
-
-- Improved mime-compatible JSON handling
-- Proxy fixes
-- Path hack fixes
-- Case-Insensitive Content-Encoding headers
-- Support for CJK parameters in form posts
-
-
-0.14.1 (2012-10-01)
-+++++++++++++++++++
-
-- Python 3.3 Compatibility
-- Simplify default accept-encoding
-- Bugfixes
-
-
-0.14.0 (2012-09-02)
-++++++++++++++++++++
-
-- No more iter_content errors if already downloaded.
-
-0.13.9 (2012-08-25)
-+++++++++++++++++++
-
-- Fix for OAuth + POSTs
-- Remove exception eating from dispatch_hook
-- General bugfixes
-
-0.13.8 (2012-08-21)
-+++++++++++++++++++
-
-- Incredible Link header support :)
-
-0.13.7 (2012-08-19)
-+++++++++++++++++++
-
-- Support for (key, value) lists everywhere.
-- Digest Authentication improvements.
-- Ensure proxy exclusions work properly.
-- Clearer UnicodeError exceptions.
-- Automatic casting of URLs to strings (fURL and such)
-- Bugfixes.
-
-0.13.6 (2012-08-06)
-+++++++++++++++++++
-
-- Long awaited fix for hanging connections!
-
-0.13.5 (2012-07-27)
-+++++++++++++++++++
-
-- Packaging fix
-
-0.13.4 (2012-07-27)
-+++++++++++++++++++
-
-- GSSAPI/Kerberos authentication!
-- App Engine 2.7 Fixes!
-- Fix leaking connections (from urllib3 update)
-- OAuthlib path hack fix
-- OAuthlib URL parameters fix.
-
-0.13.3 (2012-07-12)
-+++++++++++++++++++
-
-- Use simplejson if available.
-- Do not hide SSLErrors behind Timeouts.
-- Fixed param handling with urls containing fragments.
-- Significantly improved information in User Agent.
-- client certificates are ignored when verify=False
-
-0.13.2 (2012-06-28)
-+++++++++++++++++++
-
-- Zero dependencies (once again)!
-- New: Response.reason
-- Sign querystring parameters in OAuth 1.0
-- Client certificates no longer ignored when verify=False
-- Add openSUSE certificate support
-
-0.13.1 (2012-06-07)
-+++++++++++++++++++
-
-- Allow passing a file or file-like object as data.
-- Allow hooks to return responses that indicate errors.
-- Fix Response.text and Response.json for body-less responses.
-
-0.13.0 (2012-05-29)
-+++++++++++++++++++
-
-- Removal of Requests.async in favor of `grequests `_
-- Allow disabling of cookie persistence.
-- New implementation of safe_mode
-- cookies.get now supports default argument
-- Session cookies not saved when Session.request is called with return_response=False
-- Env: no_proxy support.
-- RequestsCookieJar improvements.
-- Various bug fixes.
-
-0.12.1 (2012-05-08)
-+++++++++++++++++++
-
-- New ``Response.json`` property.
-- Ability to add string file uploads.
-- Fix out-of-range issue with iter_lines.
-- Fix iter_content default size.
-- Fix POST redirects containing files.
-
-0.12.0 (2012-05-02)
-+++++++++++++++++++
-
-- EXPERIMENTAL OAUTH SUPPORT!
-- Proper CookieJar-backed cookies interface with awesome dict-like interface.
-- Speed fix for non-iterated content chunks.
-- Move ``pre_request`` to a more usable place.
-- New ``pre_send`` hook.
-- Lazily encode data, params, files.
-- Load system Certificate Bundle if ``certify`` isn't available.
-- Cleanups, fixes.
-
-0.11.2 (2012-04-22)
-+++++++++++++++++++
-
-- Attempt to use the OS's certificate bundle if ``certifi`` isn't available.
-- Infinite digest auth redirect fix.
-- Multi-part file upload improvements.
-- Fix decoding of invalid %encodings in URLs.
-- If there is no content in a response don't throw an error the second time that content is attempted to be read.
-- Upload data on redirects.
-
-0.11.1 (2012-03-30)
-+++++++++++++++++++
-
-* POST redirects now break RFC to do what browsers do: Follow up with a GET.
-* New ``strict_mode`` configuration to disable new redirect behavior.
-
-
-0.11.0 (2012-03-14)
-+++++++++++++++++++
-
-* Private SSL Certificate support
-* Remove select.poll from Gevent monkeypatching
-* Remove redundant generator for chunked transfer encoding
-* Fix: Response.ok raises Timeout Exception in safe_mode
-
-0.10.8 (2012-03-09)
-+++++++++++++++++++
-
-* Generate chunked ValueError fix
-* Proxy configuration by environment variables
-* Simplification of iter_lines.
-* New `trust_env` configuration for disabling system/environment hints.
-* Suppress cookie errors.
-
-0.10.7 (2012-03-07)
-+++++++++++++++++++
-
-* `encode_uri` = False
-
-0.10.6 (2012-02-25)
-+++++++++++++++++++
-
-* Allow '=' in cookies.
-
-0.10.5 (2012-02-25)
-+++++++++++++++++++
-
-* Response body with 0 content-length fix.
-* New async.imap.
-* Don't fail on netrc.
-
-
-0.10.4 (2012-02-20)
-+++++++++++++++++++
-
-* Honor netrc.
-
-0.10.3 (2012-02-20)
-+++++++++++++++++++
-
-* HEAD requests don't follow redirects anymore.
-* raise_for_status() doesn't raise for 3xx anymore.
-* Make Session objects picklable.
-* ValueError for invalid schema URLs.
-
-0.10.2 (2012-01-15)
-+++++++++++++++++++
-
-* Vastly improved URL quoting.
-* Additional allowed cookie key values.
-* Attempted fix for "Too many open files" Error
-* Replace unicode errors on first pass, no need for second pass.
-* Append '/' to bare-domain urls before query insertion.
-* Exceptions now inherit from RuntimeError.
-* Binary uploads + auth fix.
-* Bugfixes.
-
-
-0.10.1 (2012-01-23)
-+++++++++++++++++++
-
-* PYTHON 3 SUPPORT!
-* Dropped 2.5 Support. (*Backwards Incompatible*)
-
-0.10.0 (2012-01-21)
-+++++++++++++++++++
-
-* ``Response.content`` is now bytes-only. (*Backwards Incompatible*)
-* New ``Response.text`` is unicode-only.
-* If no ``Response.encoding`` is specified and ``chardet`` is available, ``Response.text`` will guess an encoding.
-* Default to ISO-8859-1 (Western) encoding for "text" subtypes.
-* Removal of `decode_unicode`. (*Backwards Incompatible*)
-* New multiple-hooks system.
-* New ``Response.register_hook`` for registering hooks within the pipeline.
-* ``Response.url`` is now Unicode.
-
-0.9.3 (2012-01-18)
-++++++++++++++++++
-
-* SSL verify=False bugfix (apparent on windows machines).
-
-0.9.2 (2012-01-18)
-++++++++++++++++++
-
-* Asynchronous async.send method.
-* Support for proper chunk streams with boundaries.
-* session argument for Session classes.
-* Print entire hook tracebacks, not just exception instance.
-* Fix response.iter_lines from pending next line.
-* Fix bug in HTTP-digest auth w/ URI having query strings.
-* Fix in Event Hooks section.
-* Urllib3 update.
-
-
-0.9.1 (2012-01-06)
-++++++++++++++++++
-
-* danger_mode for automatic Response.raise_for_status()
-* Response.iter_lines refactor
-
-0.9.0 (2011-12-28)
-++++++++++++++++++
-
-* verify ssl is default.
-
-
-0.8.9 (2011-12-28)
-++++++++++++++++++
-
-* Packaging fix.
-
-
-0.8.8 (2011-12-28)
-++++++++++++++++++
-
-* SSL CERT VERIFICATION!
-* Release of Certifi: Mozilla's cert list.
-* New 'verify' argument for SSL requests.
-* Urllib3 update.
-
-0.8.7 (2011-12-24)
-++++++++++++++++++
-
-* iter_lines last-line truncation fix
-* Force safe_mode for async requests
-* Handle safe_mode exceptions more consistently
-* Fix iteration on null responses in safe_mode
-
-0.8.6 (2011-12-18)
-++++++++++++++++++
-
-* Socket timeout fixes.
-* Proxy Authorization support.
-
-0.8.5 (2011-12-14)
-++++++++++++++++++
-
-* Response.iter_lines!
-
-0.8.4 (2011-12-11)
-++++++++++++++++++
-
-* Prefetch bugfix.
-* Added license to installed version.
-
-0.8.3 (2011-11-27)
-++++++++++++++++++
-
-* Converted auth system to use simpler callable objects.
-* New session parameter to API methods.
-* Display full URL while logging.
-
-0.8.2 (2011-11-19)
-++++++++++++++++++
-
-* New Unicode decoding system, based on over-ridable `Response.encoding`.
-* Proper URL slash-quote handling.
-* Cookies with ``[``, ``]``, and ``_`` allowed.
-
-0.8.1 (2011-11-15)
-++++++++++++++++++
-
-* URL Request path fix
-* Proxy fix.
-* Timeouts fix.
-
-0.8.0 (2011-11-13)
-++++++++++++++++++
-
-* Keep-alive support!
-* Complete removal of Urllib2
-* Complete removal of Poster
-* Complete removal of CookieJars
-* New ConnectionError raising
-* Safe_mode for error catching
-* prefetch parameter for request methods
-* OPTION method
-* Async pool size throttling
-* File uploads send real names
-* Vendored in urllib3
-
-0.7.6 (2011-11-07)
-++++++++++++++++++
-
-* Digest authentication bugfix (attach query data to path)
-
-0.7.5 (2011-11-04)
-++++++++++++++++++
-
-* Response.content = None if there was an invalid response.
-* Redirection auth handling.
-
-0.7.4 (2011-10-26)
-++++++++++++++++++
-
-* Session Hooks fix.
-
-0.7.3 (2011-10-23)
-++++++++++++++++++
-
-* Digest Auth fix.
-
-
-0.7.2 (2011-10-23)
-++++++++++++++++++
-
-* PATCH Fix.
-
-
-0.7.1 (2011-10-23)
-++++++++++++++++++
-
-* Move away from urllib2 authentication handling.
-* Fully Remove AuthManager, AuthObject, &c.
-* New tuple-based auth system with handler callbacks.
-
-
-0.7.0 (2011-10-22)
-++++++++++++++++++
-
-* Sessions are now the primary interface.
-* Deprecated InvalidMethodException.
-* PATCH fix.
-* New config system (no more global settings).
-
-
-0.6.6 (2011-10-19)
-++++++++++++++++++
-
-* Session parameter bugfix (params merging).
-
-
-0.6.5 (2011-10-18)
-++++++++++++++++++
-
-* Offline (fast) test suite.
-* Session dictionary argument merging.
-
-
-0.6.4 (2011-10-13)
-++++++++++++++++++
-
-* Automatic decoding of unicode, based on HTTP Headers.
-* New ``decode_unicode`` setting.
-* Removal of ``r.read/close`` methods.
-* New ``r.raw`` interface for advanced response usage.
-* Automatic expansion of parameterized headers.
-
-
-0.6.3 (2011-10-13)
-++++++++++++++++++
-
-* Beautiful ``requests.async`` module, for making async requests w/ gevent.
-
-
-0.6.2 (2011-10-09)
-++++++++++++++++++
-
-* GET/HEAD obeys allow_redirects=False.
-
-
-0.6.1 (2011-08-20)
-++++++++++++++++++
-
-* Enhanced status codes experience ``\o/``
-* Set a maximum number of redirects (``settings.max_redirects``)
-* Full Unicode URL support
-* Support for protocol-less redirects.
-* Allow for arbitrary request types.
-* Bugfixes
-
-
-0.6.0 (2011-08-17)
-++++++++++++++++++
-
-* New callback hook system
-* New persistent sessions object and context manager
-* Transparent Dict-cookie handling
-* Status code reference object
-* Removed Response.cached
-* Added Response.request
-* All args are kwargs
-* Relative redirect support
-* HTTPError handling improvements
-* Improved https testing
-* Bugfixes
-
-
-0.5.1 (2011-07-23)
-++++++++++++++++++
-
-* International Domain Name Support!
-* Access headers without fetching entire body (``read()``)
-* Use lists as dicts for parameters
-* Add Forced Basic Authentication
-* Forced Basic is default authentication type
-* ``python-requests.org`` default User-Agent header
-* CaseInsensitiveDict lower-case caching
-* Response.history bugfix
-
-
-0.5.0 (2011-06-21)
-++++++++++++++++++
-
-* PATCH Support
-* Support for Proxies
-* HTTPBin Test Suite
-* Redirect Fixes
-* settings.verbose stream writing
-* Querystrings for all methods
-* URLErrors (Connection Refused, Timeout, Invalid URLs) are treated as explicitly raised
- ``r.requests.get('hwe://blah'); r.raise_for_status()``
-
-
-0.4.1 (2011-05-22)
-++++++++++++++++++
-
-* Improved Redirection Handling
-* New 'allow_redirects' param for following non-GET/HEAD Redirects
-* Settings module refactoring
-
-
-0.4.0 (2011-05-15)
-++++++++++++++++++
-
-* Response.history: list of redirected responses
-* Case-Insensitive Header Dictionaries!
-* Unicode URLs
-
-
-0.3.4 (2011-05-14)
-++++++++++++++++++
-
-* Urllib2 HTTPAuthentication Recursion fix (Basic/Digest)
-* Internal Refactor
-* Bytes data upload Bugfix
-
-
-
-0.3.3 (2011-05-12)
-++++++++++++++++++
-
-* Request timeouts
-* Unicode url-encoded data
-* Settings context manager and module
-
-
-0.3.2 (2011-04-15)
-++++++++++++++++++
-
-* Automatic Decompression of GZip Encoded Content
-* AutoAuth Support for Tupled HTTP Auth
-
-
-0.3.1 (2011-04-01)
-++++++++++++++++++
-
-* Cookie Changes
-* Response.read()
-* Poster fix
-
-
-0.3.0 (2011-02-25)
-++++++++++++++++++
-
-* Automatic Authentication API Change
-* Smarter Query URL Parameterization
-* Allow file uploads and POST data together
-* New Authentication Manager System
- - Simpler Basic HTTP System
- - Supports all built-in urllib2 Auths
- - Allows for custom Auth Handlers
-
-
-0.2.4 (2011-02-19)
-++++++++++++++++++
-
-* Python 2.5 Support
-* PyPy-c v1.4 Support
-* Auto-Authentication tests
-* Improved Request object constructor
-
-0.2.3 (2011-02-15)
-++++++++++++++++++
-
-* New HTTPHandling Methods
- - Response.__nonzero__ (false if bad HTTP Status)
- - Response.ok (True if expected HTTP Status)
- - Response.error (Logged HTTPError if bad HTTP Status)
- - Response.raise_for_status() (Raises stored HTTPError)
-
-
-0.2.2 (2011-02-14)
-++++++++++++++++++
-
-* Still handles request in the event of an HTTPError. (Issue #2)
-* Eventlet and Gevent Monkeypatch support.
-* Cookie Support (Issue #1)
-
-
-0.2.1 (2011-02-14)
-++++++++++++++++++
-
-* Added file attribute to POST and PUT requests for multipart-encode file uploads.
-* Added Request.url attribute for context and redirects
-
-
-0.2.0 (2011-02-14)
-++++++++++++++++++
-
-* Birth!
-
-
-0.0.1 (2011-02-13)
-++++++++++++++++++
-
-* Frustration
-* Conception
diff --git a/LICENSE b/LICENSE
index db78ea69f4..841c6023b9 100644
--- a/LICENSE
+++ b/LICENSE
@@ -1,10 +1,10 @@
-Copyright 2017 Kenneth Reitz
+Copyright 2018 Kenneth Reitz
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
- http://www.apache.org/licenses/LICENSE-2.0
+ https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
diff --git a/MANIFEST.in b/MANIFEST.in
index 2c0fb95ce9..172ccc8412 100644
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -1,2 +1,2 @@
-include README.rst LICENSE NOTICE HISTORY.rst pytest.ini requirements.txt
+include README.md LICENSE NOTICE HISTORY.md pytest.ini requirements.txt Pipfile Pipfile.lock
recursive-include tests *.py
diff --git a/Makefile b/Makefile
index 92d1c34429..317a7c76fb 100644
--- a/Makefile
+++ b/Makefile
@@ -1,20 +1,21 @@
.PHONY: docs
init:
- pip install -r requirements.txt
+ pip install pipenv --upgrade
+ pipenv install --dev --skip-lock
test:
# This runs all of the tests, on both Python 2 and Python 3.
detox
ci:
- py.test -n 8 --boxed --junitxml=report.xml
+ pipenv run py.test -n 8 --boxed --junitxml=report.xml
test-readme:
- @python setup.py check --restructuredtext --strict && ([ $$? -eq 0 ] && echo "README.rst and HISTORY.rst ok") || echo "Invalid markup in README.rst or HISTORY.rst!"
+ @pipenv run python setup.py check --restructuredtext --strict && ([ $$? -eq 0 ] && echo "README.rst and HISTORY.rst ok") || echo "Invalid markup in README.rst or HISTORY.rst!"
flake8:
- flake8 --ignore=E501,F401,E128,E402,E731,F821 requests
+ pipenv run flake8 --ignore=E501,F401,E128,E402,E731,F821 requests
coverage:
- py.test --cov-config .coveragerc --verbose --cov-report term --cov-report xml --cov=requests tests
+ pipenv run py.test --cov-config .coveragerc --verbose --cov-report term --cov-report xml --cov=requests tests
publish:
pip install 'twine>=1.5.0'
diff --git a/Pipfile b/Pipfile
new file mode 100644
index 0000000000..3e0fd729eb
--- /dev/null
+++ b/Pipfile
@@ -0,0 +1,24 @@
+[[source]]
+url = "https://pypi.org/simple/"
+verify_ssl = true
+name = "pypi"
+
+[dev-packages]
+pytest = ">=2.8.0"
+codecov = "*"
+pytest-httpbin = ">=0.0.7"
+pytest-mock = "*"
+pytest-cov = "*"
+pytest-xdist = "*"
+alabaster = "*"
+readme-renderer = "*"
+sphinx = "<=1.5.5"
+pysocks = "*"
+docutils = "*"
+"flake8" = "*"
+tox = "*"
+detox = "*"
+httpbin = ">=0.7.0"
+
+[packages]
+"e1839a8" = {path = ".", editable = true, extras = ["socks"]}
diff --git a/Pipfile.lock b/Pipfile.lock
new file mode 100644
index 0000000000..e712de41ab
--- /dev/null
+++ b/Pipfile.lock
@@ -0,0 +1,641 @@
+{
+ "_meta": {
+ "hash": {
+ "sha256": "0735b243455d8e924fbd05188ed435bfd3917f4acdade9b9e8f14741f3fc47e9"
+ },
+ "pipfile-spec": 6,
+ "requires": {},
+ "sources": [
+ {
+ "name": "pypi",
+ "url": "https://pypi.org/simple/",
+ "verify_ssl": true
+ }
+ ]
+ },
+ "default": {
+ "certifi": {
+ "hashes": [
+ "sha256:13e698f54293db9f89122b0581843a782ad0934a4fe0172d2a980ba77fc61bb7",
+ "sha256:9fa520c1bacfb634fa7af20a76bcbd3d5fb390481724c597da32c719a7dca4b0"
+ ],
+ "version": "==2018.4.16"
+ },
+ "chardet": {
+ "hashes": [
+ "sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
+ "sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
+ ],
+ "version": "==3.0.4"
+ },
+ "e1839a8": {
+ "editable": true,
+ "extras": [
+ "socks"
+ ],
+ "path": "."
+ },
+ "idna": {
+ "hashes": [
+ "sha256:2c6a5de3089009e3da7c5dde64a141dbc8551d5b7f6cf4ed7c2568d0cc520a8f",
+ "sha256:8c7309c718f94b3a625cb648ace320157ad16ff131ae0af362c9f21b80ef6ec4"
+ ],
+ "version": "==2.6"
+ },
+ "pysocks": {
+ "hashes": [
+ "sha256:3fe52c55890a248676fd69dc9e3c4e811718b777834bcaab7a8125cf9deac672"
+ ],
+ "version": "==1.6.8"
+ },
+ "urllib3": {
+ "hashes": [
+ "sha256:06330f386d6e4b195fbfc736b297f58c5a892e4440e54d294d7004e3a9bbea1b",
+ "sha256:cc44da8e1145637334317feebd728bd869a35285b93cbb4cca2577da7e62db4f"
+ ],
+ "version": "==1.22"
+ }
+ },
+ "develop": {
+ "alabaster": {
+ "hashes": [
+ "sha256:2eef172f44e8d301d25aff8068fddd65f767a3f04b5f15b0f4922f113aa1c732",
+ "sha256:37cdcb9e9954ed60912ebc1ca12a9d12178c26637abdf124e3cde2341c257fe0"
+ ],
+ "index": "pypi",
+ "version": "==0.7.10"
+ },
+ "apipkg": {
+ "hashes": [
+ "sha256:2e38399dbe842891fe85392601aab8f40a8f4cc5a9053c326de35a1cc0297ac6",
+ "sha256:65d2aa68b28e7d31233bb2ba8eb31cda40e4671f8ac2d6b241e358c9652a74b9"
+ ],
+ "version": "==1.4"
+ },
+ "attrs": {
+ "hashes": [
+ "sha256:4b90b09eeeb9b88c35bc642cbac057e45a5fd85367b985bd2809c62b7b939265",
+ "sha256:e0d0eb91441a3b53dab4d9b743eafc1ac44476296a2053b6ca3af0b139faf87b"
+ ],
+ "version": "==18.1.0"
+ },
+ "babel": {
+ "hashes": [
+ "sha256:8ce4cb6fdd4393edd323227cba3a077bceb2a6ce5201c902c65e730046f41f14",
+ "sha256:ad209a68d7162c4cff4b29cdebe3dec4cef75492df501b0049a9433c96ce6f80"
+ ],
+ "version": "==2.5.3"
+ },
+ "bleach": {
+ "hashes": [
+ "sha256:b8fa79e91f96c2c2cd9fd1f9eda906efb1b88b483048978ba62fef680e962b34",
+ "sha256:eb7386f632349d10d9ce9d4a838b134d4731571851149f9cc2c05a9a837a9a44"
+ ],
+ "version": "==2.1.3"
+ },
+ "blinker": {
+ "hashes": [
+ "sha256:471aee25f3992bd325afa3772f1063dbdbbca947a041b8b89466dc00d606f8b6"
+ ],
+ "version": "==1.4"
+ },
+ "brotlipy": {
+ "hashes": [
+ "sha256:07194f4768eb62a4f4ea76b6d0df6ade185e24ebd85877c351daa0a069f1111a",
+ "sha256:091b299bf36dd6ef7a06570dbc98c0f80a504a56c5b797f31934d2ad01ae7d17",
+ "sha256:09ec3e125d16749b31c74f021aba809541b3564e5359f8c265cbae442810b41a",
+ "sha256:0be698678a114addcf87a4b9496c552c68a2c99bf93cf8e08f5738b392e82057",
+ "sha256:0fa6088a9a87645d43d7e21e32b4a6bf8f7c3939015a50158c10972aa7f425b7",
+ "sha256:1ea4e578241504b58f2456a6c69952c88866c794648bdc74baee74839da61d44",
+ "sha256:2699945a0a992c04fc7dc7fa2f1d0575a2c8b4b769f2874a08e8eae46bef36ae",
+ "sha256:2a80319ae13ea8dd60ecdc4f5ccf6da3ae64787765923256b62c598c5bba4121",
+ "sha256:2e5c64522364a9ebcdf47c5744a5ddeb3f934742d31e61ebfbbc095460b47162",
+ "sha256:36def0b859beaf21910157b4c33eb3b06d8ce459c942102f16988cca6ea164df",
+ "sha256:3a3e56ced8b15fbbd363380344f70f3b438e0fd1fcf27b7526b6172ea950e867",
+ "sha256:3c1d5e2cf945a46975bdb11a19257fa057b67591eb232f393d260e7246d9e571",
+ "sha256:50ca336374131cfad20612f26cc43c637ac0bfd2be3361495e99270883b52962",
+ "sha256:5de6f7d010b7558f72f4b061a07395c5c3fd57f0285c5af7f126a677b976a868",
+ "sha256:637847560d671657f993313ecc6c6c6666a936b7a925779fd044065c7bc035b9",
+ "sha256:653faef61241bf8bf99d73ca7ec4baa63401ba7b2a2aa88958394869379d67c7",
+ "sha256:786afc8c9bd67de8d31f46e408a3386331e126829114e4db034f91eacb05396d",
+ "sha256:79aaf217072840f3e9a3b641cccc51f7fc23037496bd71e26211856b93f4b4cb",
+ "sha256:7e31f7adcc5851ca06134705fcf3478210da45d35ad75ec181e1ce9ce345bb38",
+ "sha256:8b39abc3256c978f575df5cd7893153277216474f303e26f0e43ba3d3969ef96",
+ "sha256:9448227b0df082e574c45c983fa5cd4bda7bfb11ea6b59def0940c1647be0c3c",
+ "sha256:96bc59ff9b5b5552843dc67999486a220e07a0522dddd3935da05dc194fa485c",
+ "sha256:a07647886e24e2fb2d68ca8bf3ada398eb56fd8eac46c733d4d95c64d17f743b",
+ "sha256:af65d2699cb9f13b26ec3ba09e75e80d31ff422c03675fcb36ee4dabe588fdc2",
+ "sha256:b4c98b0d2c9c7020a524ca5bbff42027db1004c6571f8bc7b747f2b843128e7a",
+ "sha256:c6cc0036b1304dd0073eec416cb2f6b9e37ac8296afd9e481cac3b1f07f9db25",
+ "sha256:d2c1c724c4ac375feb2110f1af98ecdc0e5a8ea79d068efb5891f621a5b235cb",
+ "sha256:dc6c5ee0df9732a44d08edab32f8a616b769cc5a4155a12d2d010d248eb3fb07",
+ "sha256:fd1d1c64214af5d90014d82cee5d8141b13d44c92ada7a0c0ec0679c6f15a471"
+ ],
+ "version": "==0.7.0"
+ },
+ "certifi": {
+ "hashes": [
+ "sha256:13e698f54293db9f89122b0581843a782ad0934a4fe0172d2a980ba77fc61bb7",
+ "sha256:9fa520c1bacfb634fa7af20a76bcbd3d5fb390481724c597da32c719a7dca4b0"
+ ],
+ "version": "==2018.4.16"
+ },
+ "cffi": {
+ "hashes": [
+ "sha256:151b7eefd035c56b2b2e1eb9963c90c6302dc15fbd8c1c0a83a163ff2c7d7743",
+ "sha256:1553d1e99f035ace1c0544050622b7bc963374a00c467edafac50ad7bd276aef",
+ "sha256:1b0493c091a1898f1136e3f4f991a784437fac3673780ff9de3bcf46c80b6b50",
+ "sha256:2ba8a45822b7aee805ab49abfe7eec16b90587f7f26df20c71dd89e45a97076f",
+ "sha256:3c85641778460581c42924384f5e68076d724ceac0f267d66c757f7535069c93",
+ "sha256:3eb6434197633b7748cea30bf0ba9f66727cdce45117a712b29a443943733257",
+ "sha256:4c91af6e967c2015729d3e69c2e51d92f9898c330d6a851bf8f121236f3defd3",
+ "sha256:770f3782b31f50b68627e22f91cb182c48c47c02eb405fd689472aa7b7aa16dc",
+ "sha256:79f9b6f7c46ae1f8ded75f68cf8ad50e5729ed4d590c74840471fc2823457d04",
+ "sha256:7a33145e04d44ce95bcd71e522b478d282ad0eafaf34fe1ec5bbd73e662f22b6",
+ "sha256:857959354ae3a6fa3da6651b966d13b0a8bed6bbc87a0de7b38a549db1d2a359",
+ "sha256:87f37fe5130574ff76c17cab61e7d2538a16f843bb7bca8ebbc4b12de3078596",
+ "sha256:95d5251e4b5ca00061f9d9f3d6fe537247e145a8524ae9fd30a2f8fbce993b5b",
+ "sha256:9d1d3e63a4afdc29bd76ce6aa9d58c771cd1599fbba8cf5057e7860b203710dd",
+ "sha256:a36c5c154f9d42ec176e6e620cb0dd275744aa1d804786a71ac37dc3661a5e95",
+ "sha256:ae5e35a2c189d397b91034642cb0eab0e346f776ec2eb44a49a459e6615d6e2e",
+ "sha256:b0f7d4a3df8f06cf49f9f121bead236e328074de6449866515cea4907bbc63d6",
+ "sha256:b75110fb114fa366b29a027d0c9be3709579602ae111ff61674d28c93606acca",
+ "sha256:ba5e697569f84b13640c9e193170e89c13c6244c24400fc57e88724ef610cd31",
+ "sha256:be2a9b390f77fd7676d80bc3cdc4f8edb940d8c198ed2d8c0be1319018c778e1",
+ "sha256:d5d8555d9bfc3f02385c1c37e9f998e2011f0db4f90e250e5bc0c0a85a813085",
+ "sha256:e55e22ac0a30023426564b1059b035973ec82186ddddbac867078435801c7801",
+ "sha256:e90f17980e6ab0f3c2f3730e56d1fe9bcba1891eeea58966e89d352492cc74f4",
+ "sha256:ecbb7b01409e9b782df5ded849c178a0aa7c906cf8c5a67368047daab282b184",
+ "sha256:ed01918d545a38998bfa5902c7c00e0fee90e957ce036a4000a88e3fe2264917",
+ "sha256:edabd457cd23a02965166026fd9bfd196f4324fe6032e866d0f3bd0301cd486f",
+ "sha256:fdf1c1dc5bafc32bc5d08b054f94d659422b05aba244d6be4ddc1c72d9aa70fb"
+ ],
+ "version": "==1.11.5"
+ },
+ "chardet": {
+ "hashes": [
+ "sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
+ "sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
+ ],
+ "version": "==3.0.4"
+ },
+ "click": {
+ "hashes": [
+ "sha256:29f99fc6125fbc931b758dc053b3114e55c77a6e4c6c3a2674a2dc986016381d",
+ "sha256:f15516df478d5a56180fbf80e68f206010e6d160fc39fa508b65e035fd75130b"
+ ],
+ "version": "==6.7"
+ },
+ "cmarkgfm": {
+ "hashes": [
+ "sha256:0186dccca79483e3405217993b83b914ba4559fe9a8396efc4eea56561b74061",
+ "sha256:1a625afc6f62da428df96ec325dc30866cc5781520cbd904ff4ec44cf018171c",
+ "sha256:275905bb371a99285c74931700db3f0c078e7603bed383e8cf1a09f3ee05a3de",
+ "sha256:50098f1c4950722521f0671e54139e0edc1837d63c990cf0f3d2c49607bb51a2",
+ "sha256:50ed116d0b60a07df0dc7b180c28569064b9d37d1578d4c9021cff04d725cb63",
+ "sha256:61a72def110eed903cd1848245897bcb80d295cd9d13944d4f9f30cba5b76655",
+ "sha256:64186fb75d973a06df0e6ea12879533b71f6e7ba1ab01ffee7fc3e7534758889",
+ "sha256:665303d34d7f14f10d7b0651082f25ebf7107f29ef3d699490cac16cdc0fc8ce",
+ "sha256:70b18f843aec58e4e64aadce48a897fe7c50426718b7753aaee399e72df64190",
+ "sha256:761ee7b04d1caee2931344ac6bfebf37102ffb203b136b676b0a71a3f0ea3c87",
+ "sha256:811527e9b7280b136734ed6cb6845e5fbccaeaa132ddf45f0246cbe544016957",
+ "sha256:987b0e157f70c72a84f3c2f9ef2d7ab0f26c08f2bf326c12c087ff9eebcb3ff5",
+ "sha256:9fc6a2183d0a9b0974ec7cdcdad42bd78a3be674cc3e65f87dd694419b3b0ab7",
+ "sha256:c573ea89dd95d41b6d8cf36799c34b6d5b1eac4aed0212dee0f0a11fb7b01e8f",
+ "sha256:c5f1b9e8592d2c448c44e6bc0d91224b16ea5f8293908b1561de1f6d2d0658b1",
+ "sha256:cbe581456357d8f0674d6a590b1aaf46c11d01dd0a23af147a51a798c3818034",
+ "sha256:cf219bec69e601fe27e3974b7307d2f06082ab385d42752738ad2eb630a47d65",
+ "sha256:d08bad67fa18f7e8ff738c090628ee0cbf0505d74a991c848d6d04abfe67b697",
+ "sha256:d6f716d7b1182bf35862b5065112f933f43dd1aa4f8097c9bcfb246f71528a34",
+ "sha256:e08e479102627641c7cb4ece421c6ed4124820b1758765db32201136762282d9",
+ "sha256:e20ac21418af0298437d29599f7851915497ce9f2866bc8e86b084d8911ee061",
+ "sha256:e25f53c37e319241b9a412382140dffac98ca756ba8f360ac7ab5e30cad9670a",
+ "sha256:f20900f16377f2109783ae9348d34bc80530808439591c3d3df73d5c7ef1a00c"
+ ],
+ "version": "==0.4.2"
+ },
+ "codecov": {
+ "hashes": [
+ "sha256:8ed8b7c6791010d359baed66f84f061bba5bd41174bf324c31311e8737602788",
+ "sha256:ae00d68e18d8a20e9c3288ba3875ae03db3a8e892115bf9b83ef20507732bed4"
+ ],
+ "index": "pypi",
+ "version": "==2.0.15"
+ },
+ "configparser": {
+ "hashes": [
+ "sha256:5308b47021bc2340965c371f0f058cc6971a04502638d4244225c49d80db273a"
+ ],
+ "markers": "python_version < '3.2'",
+ "version": "==3.5.0"
+ },
+ "contextlib2": {
+ "hashes": [
+ "sha256:509f9419ee91cdd00ba34443217d5ca51f5a364a404e1dce9e8979cea969ca48",
+ "sha256:f5260a6e679d2ff42ec91ec5252f4eeffdcf21053db9113bd0a8e4d953769c00"
+ ],
+ "markers": "python_version < '3.2'",
+ "version": "==0.5.5"
+ },
+ "coverage": {
+ "hashes": [
+ "sha256:03481e81d558d30d230bc12999e3edffe392d244349a90f4ef9b88425fac74ba",
+ "sha256:0b136648de27201056c1869a6c0d4e23f464750fd9a9ba9750b8336a244429ed",
+ "sha256:104ab3934abaf5be871a583541e8829d6c19ce7bde2923b2751e0d3ca44db60a",
+ "sha256:15b111b6a0f46ee1a485414a52a7ad1d703bdf984e9ed3c288a4414d3871dcbd",
+ "sha256:198626739a79b09fa0a2f06e083ffd12eb55449b5f8bfdbeed1df4910b2ca640",
+ "sha256:1c383d2ef13ade2acc636556fd544dba6e14fa30755f26812f54300e401f98f2",
+ "sha256:28b2191e7283f4f3568962e373b47ef7f0392993bb6660d079c62bd50fe9d162",
+ "sha256:2eb564bbf7816a9d68dd3369a510be3327f1c618d2357fa6b1216994c2e3d508",
+ "sha256:337ded681dd2ef9ca04ef5d93cfc87e52e09db2594c296b4a0a3662cb1b41249",
+ "sha256:3a2184c6d797a125dca8367878d3b9a178b6fdd05fdc2d35d758c3006a1cd694",
+ "sha256:3c79a6f7b95751cdebcd9037e4d06f8d5a9b60e4ed0cd231342aa8ad7124882a",
+ "sha256:3d72c20bd105022d29b14a7d628462ebdc61de2f303322c0212a054352f3b287",
+ "sha256:3eb42bf89a6be7deb64116dd1cc4b08171734d721e7a7e57ad64cc4ef29ed2f1",
+ "sha256:4635a184d0bbe537aa185a34193898eee409332a8ccb27eea36f262566585000",
+ "sha256:56e448f051a201c5ebbaa86a5efd0ca90d327204d8b059ab25ad0f35fbfd79f1",
+ "sha256:5a13ea7911ff5e1796b6d5e4fbbf6952381a611209b736d48e675c2756f3f74e",
+ "sha256:69bf008a06b76619d3c3f3b1983f5145c75a305a0fea513aca094cae5c40a8f5",
+ "sha256:6bc583dc18d5979dc0f6cec26a8603129de0304d5ae1f17e57a12834e7235062",
+ "sha256:701cd6093d63e6b8ad7009d8a92425428bc4d6e7ab8d75efbb665c806c1d79ba",
+ "sha256:7608a3dd5d73cb06c531b8925e0ef8d3de31fed2544a7de6c63960a1e73ea4bc",
+ "sha256:76ecd006d1d8f739430ec50cc872889af1f9c1b6b8f48e29941814b09b0fd3cc",
+ "sha256:7aa36d2b844a3e4a4b356708d79fd2c260281a7390d678a10b91ca595ddc9e99",
+ "sha256:7d3f553904b0c5c016d1dad058a7554c7ac4c91a789fca496e7d8347ad040653",
+ "sha256:7e1fe19bd6dce69d9fd159d8e4a80a8f52101380d5d3a4d374b6d3eae0e5de9c",
+ "sha256:8c3cb8c35ec4d9506979b4cf90ee9918bc2e49f84189d9bf5c36c0c1119c6558",
+ "sha256:9d6dd10d49e01571bf6e147d3b505141ffc093a06756c60b053a859cb2128b1f",
+ "sha256:9e112fcbe0148a6fa4f0a02e8d58e94470fc6cb82a5481618fea901699bf34c4",
+ "sha256:ac4fef68da01116a5c117eba4dd46f2e06847a497de5ed1d64bb99a5fda1ef91",
+ "sha256:b8815995e050764c8610dbc82641807d196927c3dbed207f0a079833ffcf588d",
+ "sha256:be6cfcd8053d13f5f5eeb284aa8a814220c3da1b0078fa859011c7fffd86dab9",
+ "sha256:c1bb572fab8208c400adaf06a8133ac0712179a334c09224fb11393e920abcdd",
+ "sha256:de4418dadaa1c01d497e539210cb6baa015965526ff5afc078c57ca69160108d",
+ "sha256:e05cb4d9aad6233d67e0541caa7e511fa4047ed7750ec2510d466e806e0255d6",
+ "sha256:e4d96c07229f58cb686120f168276e434660e4358cc9cf3b0464210b04913e77",
+ "sha256:f3f501f345f24383c0000395b26b726e46758b71393267aeae0bd36f8b3ade80",
+ "sha256:f8a923a85cb099422ad5a2e345fe877bbc89a8a8b23235824a93488150e45f6e"
+ ],
+ "version": "==4.5.1"
+ },
+ "decorator": {
+ "hashes": [
+ "sha256:2c51dff8ef3c447388fe5e4453d24a2bf128d3a4c32af3fabef1f01c6851ab82",
+ "sha256:c39efa13fbdeb4506c476c9b3babf6a718da943dab7811c206005a4a956c080c"
+ ],
+ "version": "==4.3.0"
+ },
+ "detox": {
+ "hashes": [
+ "sha256:cb24895a0e4f95c0bcb1087a201c453600e075568af00848e91518fb2b984568",
+ "sha256:f3119bca4444f1e8a1d7189b064c52cfdd9a89ad3a1c921d78b49bf7f5dc5b1b"
+ ],
+ "index": "pypi",
+ "version": "==0.12"
+ },
+ "docutils": {
+ "hashes": [
+ "sha256:02aec4bd92ab067f6ff27a38a38a41173bf01bed8f89157768c1573f53e474a6",
+ "sha256:51e64ef2ebfb29cae1faa133b3710143496eca21c530f3f71424d77687764274",
+ "sha256:7a4bd47eaf6596e1295ecb11361139febe29b084a87bf005bf899f9a42edc3c6"
+ ],
+ "index": "pypi",
+ "version": "==0.14"
+ },
+ "enum34": {
+ "hashes": [
+ "sha256:2d81cbbe0e73112bdfe6ef8576f2238f2ba27dd0d55752a776c41d38b7da2850",
+ "sha256:644837f692e5f550741432dd3f223bbb9852018674981b1664e5dc339387588a",
+ "sha256:6bd0f6ad48ec2aa117d3d141940d484deccda84d4fcd884f5c3d93c23ecd8c79",
+ "sha256:8ad8c4783bf61ded74527bffb48ed9b54166685e4230386a9ed9b1279e2df5b1"
+ ],
+ "markers": "python_version < '3.4'",
+ "version": "==1.1.6"
+ },
+ "eventlet": {
+ "hashes": [
+ "sha256:06cffa55b335cc4fc32d0079242a81e8a9cddf2581d64d5f0543e2d412b26ca8",
+ "sha256:554a50dad7abee0a9775b0780ce9d9c0bd9123dda4743c46d4314170267c6c47"
+ ],
+ "version": "==0.23.0"
+ },
+ "execnet": {
+ "hashes": [
+ "sha256:a7a84d5fa07a089186a329528f127c9d73b9de57f1a1131b82bb5320ee651f6a",
+ "sha256:fc155a6b553c66c838d1a22dba1dc9f5f505c43285a878c6f74a79c024750b83"
+ ],
+ "version": "==1.5.0"
+ },
+ "flake8": {
+ "hashes": [
+ "sha256:7253265f7abd8b313e3892944044a365e3f4ac3fcdcfb4298f55ee9ddf188ba0",
+ "sha256:c7841163e2b576d435799169b78703ad6ac1bbb0f199994fc05f700b2a90ea37"
+ ],
+ "index": "pypi",
+ "version": "==3.5.0"
+ },
+ "flask": {
+ "hashes": [
+ "sha256:2271c0070dbcb5275fad4a82e29f23ab92682dc45f9dfbc22c02ba9b9322ce48",
+ "sha256:a080b744b7e345ccfcbc77954861cb05b3c63786e93f2b3875e0913d44b43f05"
+ ],
+ "version": "==1.0.2"
+ },
+ "funcsigs": {
+ "hashes": [
+ "sha256:330cc27ccbf7f1e992e69fef78261dc7c6569012cf397db8d3de0234e6c937ca",
+ "sha256:a7bb0f2cf3a3fd1ab2732cb49eba4252c2af4240442415b4abce3b87022a8f50"
+ ],
+ "markers": "python_version < '3.3'",
+ "version": "==1.0.2"
+ },
+ "future": {
+ "hashes": [
+ "sha256:e39ced1ab767b5936646cedba8bcce582398233d6a627067d4c6a454c90cfedb"
+ ],
+ "version": "==0.16.0"
+ },
+ "greenlet": {
+ "hashes": [
+ "sha256:09ef2636ea35782364c830f07127d6c7a70542b178268714a9a9ba16318e7e8b",
+ "sha256:0fef83d43bf87a5196c91e73cb9772f945a4caaff91242766c5916d1dd1381e4",
+ "sha256:1b7df09c6598f5cfb40f843ade14ed1eb40596e75cd79b6fa2efc750ba01bb01",
+ "sha256:1fff21a2da5f9e03ddc5bd99131a6b8edf3d7f9d6bc29ba21784323d17806ed7",
+ "sha256:42118bf608e0288e35304b449a2d87e2ba77d1e373e8aa221ccdea073de026fa",
+ "sha256:50643fd6d54fd919f9a0a577c5f7b71f5d21f0959ab48767bd4bb73ae0839500",
+ "sha256:58798b5d30054bb4f6cf0f712f08e6092df23a718b69000786634a265e8911a9",
+ "sha256:5b49b3049697aeae17ef7bf21267e69972d9e04917658b4e788986ea5cc518e8",
+ "sha256:75c413551a436b462d5929255b6dc9c0c3c2b25cbeaee5271a56c7fda8ca49c0",
+ "sha256:769b740aeebd584cd59232be84fdcaf6270b8adc356596cdea5b2152c82caaac",
+ "sha256:ad2383d39f13534f3ca5c48fe1fc0975676846dc39c2cece78c0f1f9891418e0",
+ "sha256:b417bb7ff680d43e7bd7a13e2e08956fa6acb11fd432f74c97b7664f8bdb6ec1",
+ "sha256:b6ef0cabaf5a6ecb5ac122e689d25ba12433a90c7b067b12e5f28bdb7fb78254",
+ "sha256:c2de19c88bdb0366c976cc125dca1002ec1b346989d59524178adfd395e62421",
+ "sha256:c7b04a6dc74087b1598de8d713198de4718fa30ec6cbb84959b26426c198e041",
+ "sha256:f8f2a0ae8de0b49c7b5b2daca4f150fdd9c1173e854df2cce3b04123244f9f45",
+ "sha256:fcfadaf4bf68a27e5dc2f42cbb2f4b4ceea9f05d1d0b8f7787e640bed2801634"
+ ],
+ "version": "==0.4.13"
+ },
+ "html5lib": {
+ "hashes": [
+ "sha256:20b159aa3badc9d5ee8f5c647e5efd02ed2a66ab8d354930bd9ff139fc1dc0a3",
+ "sha256:66cb0dcfdbbc4f9c3ba1a63fdb511ffdbd4f513b2b6d81b80cd26ce6b3fb3736"
+ ],
+ "version": "==1.0.1"
+ },
+ "httpbin": {
+ "hashes": [
+ "sha256:7a04b5904c80b7aa04dd0a6af6520d68ce17a5db175e66a64b971f8e93d73a26",
+ "sha256:cbb37790c91575f4f15757f42ad41d9f729eb227d5edbe89e4ec175486db8dfa"
+ ],
+ "index": "pypi",
+ "version": "==0.7.0"
+ },
+ "idna": {
+ "hashes": [
+ "sha256:2c6a5de3089009e3da7c5dde64a141dbc8551d5b7f6cf4ed7c2568d0cc520a8f",
+ "sha256:8c7309c718f94b3a625cb648ace320157ad16ff131ae0af362c9f21b80ef6ec4"
+ ],
+ "version": "==2.6"
+ },
+ "imagesize": {
+ "hashes": [
+ "sha256:3620cc0cadba3f7475f9940d22431fc4d407269f1be59ec9b8edcca26440cf18",
+ "sha256:5b326e4678b6925158ccc66a9fa3122b6106d7c876ee32d7de6ce59385b96315"
+ ],
+ "version": "==1.0.0"
+ },
+ "itsdangerous": {
+ "hashes": [
+ "sha256:cbb3fcf8d3e33df861709ecaf89d9e6629cff0a217bc2848f1b41cd30d360519"
+ ],
+ "version": "==0.24"
+ },
+ "jinja2": {
+ "hashes": [
+ "sha256:74c935a1b8bb9a3947c50a54766a969d4846290e1e788ea44c1392163723c3bd",
+ "sha256:f84be1bb0040caca4cea721fcbbbbd61f9be9464ca236387158b0feea01914a4"
+ ],
+ "version": "==2.10"
+ },
+ "markupsafe": {
+ "hashes": [
+ "sha256:a6be69091dac236ea9c6bc7d012beab42010fa914c459791d627dad4910eb665"
+ ],
+ "version": "==1.0"
+ },
+ "mccabe": {
+ "hashes": [
+ "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42",
+ "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"
+ ],
+ "version": "==0.6.1"
+ },
+ "mock": {
+ "hashes": [
+ "sha256:5ce3c71c5545b472da17b72268978914d0252980348636840bd34a00b5cc96c1",
+ "sha256:b158b6df76edd239b8208d481dc46b6afd45a846b7812ff0ce58971cf5bc8bba"
+ ],
+ "markers": "python_version < '3.0'",
+ "version": "==2.0.0"
+ },
+ "more-itertools": {
+ "hashes": [
+ "sha256:0dd8f72eeab0d2c3bd489025bb2f6a1b8342f9b198f6fc37b52d15cfa4531fea",
+ "sha256:11a625025954c20145b37ff6309cd54e39ca94f72f6bb9576d1195db6fa2442e",
+ "sha256:c9ce7eccdcb901a2c75d326ea134e0886abfbea5f93e91cc95de9507c0816c44"
+ ],
+ "version": "==4.1.0"
+ },
+ "pbr": {
+ "hashes": [
+ "sha256:680bf5ba9b28dd56e08eb7c267991a37c7a5f90a92c2e07108829931a50ff80a",
+ "sha256:6874feb22334a1e9a515193cba797664e940b763440c88115009ec323a7f2df5"
+ ],
+ "version": "==4.0.3"
+ },
+ "pluggy": {
+ "hashes": [
+ "sha256:7f8ae7f5bdf75671a718d2daf0a64b7885f74510bcd98b1a0bb420eb9a9d0cff",
+ "sha256:d345c8fe681115900d6da8d048ba67c25df42973bda370783cd58826442dcd7c",
+ "sha256:e160a7fcf25762bb60efc7e171d4497ff1d8d2d75a3d0df7a21b76821ecbf5c5"
+ ],
+ "version": "==0.6.0"
+ },
+ "py": {
+ "hashes": [
+ "sha256:29c9fab495d7528e80ba1e343b958684f4ace687327e6f789a94bf3d1915f881",
+ "sha256:983f77f3331356039fdd792e9220b7b8ee1aa6bd2b25f567a963ff1de5a64f6a"
+ ],
+ "version": "==1.5.3"
+ },
+ "pycodestyle": {
+ "hashes": [
+ "sha256:682256a5b318149ca0d2a9185d365d8864a768a28db66a84a2ea946bcc426766",
+ "sha256:6c4245ade1edfad79c3446fadfc96b0de2759662dc29d07d80a6f27ad1ca6ba9"
+ ],
+ "version": "==2.3.1"
+ },
+ "pycparser": {
+ "hashes": [
+ "sha256:99a8ca03e29851d96616ad0404b4aad7d9ee16f25c9f9708a11faf2810f7b226"
+ ],
+ "version": "==2.18"
+ },
+ "pyflakes": {
+ "hashes": [
+ "sha256:08bd6a50edf8cffa9fa09a463063c425ecaaf10d1eb0335a7e8b1401aef89e6f",
+ "sha256:8d616a382f243dbf19b54743f280b80198be0bca3a5396f1d2e1fca6223e8805"
+ ],
+ "version": "==1.6.0"
+ },
+ "pygments": {
+ "hashes": [
+ "sha256:78f3f434bcc5d6ee09020f92ba487f95ba50f1e3ef83ae96b9d5ffa1bab25c5d",
+ "sha256:dbae1046def0efb574852fab9e90209b23f556367b5a320c0bcb871c77c3e8cc"
+ ],
+ "version": "==2.2.0"
+ },
+ "pysocks": {
+ "hashes": [
+ "sha256:3fe52c55890a248676fd69dc9e3c4e811718b777834bcaab7a8125cf9deac672"
+ ],
+ "version": "==1.6.8"
+ },
+ "pytest": {
+ "hashes": [
+ "sha256:54713b26c97538db6ff0703a12b19aeaeb60b5e599de542e7fca0ec83b9038e8",
+ "sha256:829230122facf05a5f81a6d4dfe6454a04978ea3746853b2b84567ecf8e5c526"
+ ],
+ "index": "pypi",
+ "version": "==3.5.1"
+ },
+ "pytest-cov": {
+ "hashes": [
+ "sha256:03aa752cf11db41d281ea1d807d954c4eda35cfa1b21d6971966cc041bbf6e2d",
+ "sha256:890fe5565400902b0c78b5357004aab1c814115894f4f21370e2433256a3eeec"
+ ],
+ "index": "pypi",
+ "version": "==2.5.1"
+ },
+ "pytest-forked": {
+ "hashes": [
+ "sha256:e4500cd0509ec4a26535f7d4112a8cc0f17d3a41c29ffd4eab479d2a55b30805",
+ "sha256:f275cb48a73fc61a6710726348e1da6d68a978f0ec0c54ece5a5fae5977e5a08"
+ ],
+ "version": "==0.2"
+ },
+ "pytest-httpbin": {
+ "hashes": [
+ "sha256:8cd57e27418a7d7d205fcc9802eea246ed06170e3065abfa76c6d9b40553592c",
+ "sha256:d3919c5df0b644454129c0066a8ae62db40ac54bacb4cfd89d8cfa58615a4b42"
+ ],
+ "index": "pypi",
+ "version": "==0.3.0"
+ },
+ "pytest-mock": {
+ "hashes": [
+ "sha256:53801e621223d34724926a5c98bd90e8e417ce35264365d39d6c896388dcc928",
+ "sha256:d89a8209d722b8307b5e351496830d5cc5e192336003a485443ae9adeb7dd4c0"
+ ],
+ "index": "pypi",
+ "version": "==1.10.0"
+ },
+ "pytest-xdist": {
+ "hashes": [
+ "sha256:be2662264b035920ba740ed6efb1c816a83c8a22253df7766d129f6a7bfdbd35",
+ "sha256:e8f5744acc270b3e7d915bdb4d5f471670f049b6fbd163d4cbd52203b075d30f"
+ ],
+ "index": "pypi",
+ "version": "==1.22.2"
+ },
+ "pytz": {
+ "hashes": [
+ "sha256:65ae0c8101309c45772196b21b74c46b2e5d11b6275c45d251b150d5da334555",
+ "sha256:c06425302f2cf668f1bba7a0a03f3c1d34d4ebeef2c72003da308b3947c7f749"
+ ],
+ "version": "==2018.4"
+ },
+ "raven": {
+ "hashes": [
+ "sha256:1c641e5ebc2d4185560608e253970ca0d4b98475f4edf67735015a415f9e1d48",
+ "sha256:95aecf76c414facaddbb056f3e98c7936318123e467728f2e50b3a66b65a6ef7"
+ ],
+ "version": "==6.8.0"
+ },
+ "readme-renderer": {
+ "hashes": [
+ "sha256:bde909eaa84d65b7942f7e6998c8b427b90b568b2630ff0306f4ca75f6d2a909",
+ "sha256:e7e43a7ba49f08c3cb660d0f2e25b561f6b10b36c63f029060230aab2dc2875e"
+ ],
+ "index": "pypi",
+ "version": "==20.0"
+ },
+ "requests": {
+ "hashes": [
+ "sha256:6a1b267aa90cac58ac3a765d067950e7dbbf75b1da07e895d1f594193a40a38b",
+ "sha256:9c443e7324ba5b85070c4a818ade28bfabedf16ea10206da1132edaa6dda237e"
+ ],
+ "version": "==2.18.4"
+ },
+ "six": {
+ "hashes": [
+ "sha256:70e8a77beed4562e7f14fe23a786b54f6296e34344c23bc42f07b15018ff98e9",
+ "sha256:832dc0e10feb1aa2c68dcc57dbb658f1c7e65b9b61af69048abc87a2db00a0eb"
+ ],
+ "version": "==1.11.0"
+ },
+ "snowballstemmer": {
+ "hashes": [
+ "sha256:919f26a68b2c17a7634da993d91339e288964f93c274f1343e3bbbe2096e1128",
+ "sha256:9f3bcd3c401c3e862ec0ebe6d2c069ebc012ce142cce209c098ccb5b09136e89"
+ ],
+ "version": "==1.2.1"
+ },
+ "sphinx": {
+ "hashes": [
+ "sha256:11f271e7a9398385ed730e90f0bb41dc3815294bdcd395b46ed2d033bc2e7d87",
+ "sha256:4064ea6c56feeb268838cb8fbbee507d0c3d5d92fa63a7df935a916b52c9e2f5"
+ ],
+ "index": "pypi",
+ "version": "==1.5.5"
+ },
+ "tox": {
+ "hashes": [
+ "sha256:96efa09710a3daeeb845561ebbe1497641d9cef2ee0aea30db6969058b2bda2f",
+ "sha256:9ee7de958a43806402a38c0d2aa07fa8553f4d2c20a15b140e9f771c2afeade0"
+ ],
+ "index": "pypi",
+ "version": "==3.0.0"
+ },
+ "urllib3": {
+ "hashes": [
+ "sha256:06330f386d6e4b195fbfc736b297f58c5a892e4440e54d294d7004e3a9bbea1b",
+ "sha256:cc44da8e1145637334317feebd728bd869a35285b93cbb4cca2577da7e62db4f"
+ ],
+ "version": "==1.22"
+ },
+ "virtualenv": {
+ "hashes": [
+ "sha256:1d7e241b431e7afce47e77f8843a276f652699d1fa4f93b9d8ce0076fd7b0b54",
+ "sha256:e8e05d4714a1c51a2f5921e62f547fcb0f713ebbe959e0a7f585cc8bef71d11f"
+ ],
+ "version": "==15.2.0"
+ },
+ "webencodings": {
+ "hashes": [
+ "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78",
+ "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923"
+ ],
+ "version": "==0.5.1"
+ },
+ "werkzeug": {
+ "hashes": [
+ "sha256:c3fd7a7d41976d9f44db327260e263132466836cef6f91512889ed60ad26557c",
+ "sha256:d5da73735293558eb1651ee2fddc4d0dedcfa06538b8813a2e20011583c9e49b"
+ ],
+ "version": "==0.14.1"
+ }
+ }
+}
diff --git a/README.md b/README.md
new file mode 100644
index 0000000000..ee3b2bf5e5
--- /dev/null
+++ b/README.md
@@ -0,0 +1,107 @@
+Requests: HTTP for Humans™
+==========================
+
+[![image](https://img.shields.io/pypi/v/requests.svg)](https://pypi.org/project/requests/)
+[![image](https://img.shields.io/pypi/l/requests.svg)](https://pypi.org/project/requests/)
+[![image](https://img.shields.io/pypi/pyversions/requests.svg)](https://pypi.org/project/requests/)
+[![codecov.io](https://codecov.io/github/requests/requests/coverage.svg?branch=master)](https://codecov.io/github/requests/requests)
+[![image](https://img.shields.io/github/contributors/requests/requests.svg)](https://github.com/requests/requests/graphs/contributors)
+[![image](https://img.shields.io/badge/Say%20Thanks-!-1EAEDB.svg)](https://saythanks.io/to/kennethreitz)
+
+**If you're interested in financially supporting Kenneth Reitz's open-source work, consider [visiting this link](https://cash.me/$KennethReitz). Your support helps tremendously with sustainability of motivation, as Open Source is no longer part of my day job.**
+
+Requests is the only *Non-GMO* HTTP library for Python, safe for human
+consumption.
+
+
+
+Behold, the power of Requests:
+
+``` {.sourceCode .python}
+>>> import requests
+>>> r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
+>>> r.status_code
+200
+>>> r.headers['content-type']
+'application/json; charset=utf8'
+>>> r.encoding
+'utf-8'
+>>> r.text
+u'{"type":"User"...'
+>>> r.json()
+{u'disk_usage': 368627, u'private_gists': 484, ...}
+```
+
+See [the similar code, sans Requests](https://gist.github.com/973705).
+
+[](http://docs.python-requests.org/)
+
+Requests allows you to send *organic, grass-fed* HTTP/1.1 requests,
+without the need for manual labor. There's no need to manually add query
+strings to your URLs, or to form-encode your POST data. Keep-alive and
+HTTP connection pooling are 100% automatic, thanks to
+[urllib3](https://github.com/shazow/urllib3).
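The "no manual query strings" claim above can be sketched without any network traffic by preparing a request and inspecting the URL Requests builds. The URL and parameter names below are illustrative, not from the README:

```python
import requests

# Pass a dict via `params` and Requests encodes the query string for
# you; Request.prepare() shows the result without opening a socket.
req = requests.Request('GET', 'https://httpbin.org/get',
                       params={'q': 'python', 'page': '2'})
prepared = req.prepare()
print(prepared.url)  # https://httpbin.org/get?q=python&page=2
```

The same `prepare()` step also applies form-encoding when `data` is a dict, which is what "no need to form-encode your POST data" refers to.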
+
+Besides, all the cool kids are doing it. Requests is one of the most
+downloaded Python packages of all time, pulling in over 11,000,000
+downloads every month. You don't want to be left out!
+
+Feature Support
+---------------
+
+Requests is ready for today's web.
+
+- International Domains and URLs
+- Keep-Alive & Connection Pooling
+- Sessions with Cookie Persistence
+- Browser-style SSL Verification
+- Basic/Digest Authentication
+- Elegant Key/Value Cookies
+- Automatic Decompression
+- Automatic Content Decoding
+- Unicode Response Bodies
+- Multipart File Uploads
+- HTTP(S) Proxy Support
+- Connection Timeouts
+- Streaming Downloads
+- `.netrc` Support
+- Chunked Requests
+
+Requests officially supports Python 2.7 & 3.4–3.7, and runs great on
+PyPy.
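One of the listed features, sessions with persistent settings, can be sketched offline as well. `X-Demo-Token` is a made-up header name used only for illustration:

```python
import requests

# Headers set on a Session are merged into every request it prepares;
# prepare_request() shows the merge without any network traffic.
session = requests.Session()
session.headers.update({'X-Demo-Token': 'abc123'})
prepared = session.prepare_request(
    requests.Request('GET', 'https://example.org/'))
print(prepared.headers['X-Demo-Token'])  # abc123
```

Cookies returned by responses are persisted on the session the same way and sent on subsequent requests.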
+
+Installation
+------------
+
+To install Requests, simply use [pipenv](http://pipenv.org/) (or pip, of
+course):
+
+``` {.sourceCode .bash}
+$ pipenv install requests
+✨🍰✨
+```
+
+Satisfaction guaranteed.
+
+Documentation
+-------------
+
+Fantastic documentation is available at
+<http://docs.python-requests.org/>, for a limited time only.
+
+How to Contribute
+-----------------
+
+1. Check for open issues or open a fresh issue to start a discussion
+ around a feature idea or a bug. There is a [Contributor
+ Friendly](https://github.com/requests/requests/issues?direction=desc&labels=Contributor+Friendly&page=1&sort=updated&state=open)
+ tag for issues that should be ideal for people who are not very
+ familiar with the codebase yet.
+2. Fork [the repository](https://github.com/requests/requests) on
+ GitHub to start making your changes to the **master** branch (or
+ branch off of it).
+3. Write a test which shows that the bug was fixed or that the feature
+ works as expected.
+4. Send a pull request and bug the maintainer until it gets merged and
+ published. :) Make sure to add yourself to
+ [AUTHORS](https://github.com/requests/requests/blob/master/AUTHORS.rst).
+
diff --git a/README.rst b/README.rst
deleted file mode 100644
index fe6cf5f65c..0000000000
--- a/README.rst
+++ /dev/null
@@ -1,114 +0,0 @@
-Requests: HTTP for Humans
-=========================
-
-.. image:: https://img.shields.io/pypi/v/requests.svg
- :target: https://pypi.python.org/pypi/requests
-
-.. image:: https://img.shields.io/pypi/l/requests.svg
- :target: https://pypi.python.org/pypi/requests
-
-.. image:: https://img.shields.io/pypi/pyversions/requests.svg
- :target: https://pypi.python.org/pypi/requests
-
-.. image:: https://codecov.io/github/requests/requests/coverage.svg?branch=master
- :target: https://codecov.io/github/requests/requests
- :alt: codecov.io
-
-.. image:: https://img.shields.io/github/contributors/requests/requests.svg
- :target: https://github.com/requests/requests/graphs/contributors
-
-.. image:: https://img.shields.io/badge/Say%20Thanks-!-1EAEDB.svg
- :target: https://saythanks.io/to/kennethreitz
-
-
-
-Requests is the only *Non-GMO* HTTP library for Python, safe for human
-consumption.
-
-**Warning:** Recreational use of the Python standard library for HTTP may result in dangerous side-effects,
-including: security vulnerabilities, verbose code, reinventing the wheel,
-constantly reading documentation, depression, headaches, or even death.
-
-Behold, the power of Requests:
-
-.. code-block:: python
-
- >>> r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
- >>> r.status_code
- 200
- >>> r.headers['content-type']
- 'application/json; charset=utf8'
- >>> r.encoding
- 'utf-8'
- >>> r.text
- u'{"type":"User"...'
- >>> r.json()
- {u'disk_usage': 368627, u'private_gists': 484, ...}
-
-See `the similar code, sans Requests <https://gist.github.com/973705>`_.
-
-.. image:: https://raw.githubusercontent.com/requests/requests/master/docs/_static/requests-logo-small.png
- :target: http://docs.python-requests.org/
-
-
-Requests allows you to send *organic, grass-fed* HTTP/1.1 requests, without the
-need for manual labor. There's no need to manually add query strings to your
-URLs, or to form-encode your POST data. Keep-alive and HTTP connection pooling
-are 100% automatic, thanks to `urllib3 <https://github.com/shazow/urllib3>`_.
-
-Besides, all the cool kids are doing it. Requests is one of the most
-downloaded Python packages of all time, pulling in over 11,000,000 downloads
-every month. You don't want to be left out!
-
-Feature Support
----------------
-
-Requests is ready for today's web.
-
-- International Domains and URLs
-- Keep-Alive & Connection Pooling
-- Sessions with Cookie Persistence
-- Browser-style SSL Verification
-- Basic/Digest Authentication
-- Elegant Key/Value Cookies
-- Automatic Decompression
-- Automatic Content Decoding
-- Unicode Response Bodies
-- Multipart File Uploads
-- HTTP(S) Proxy Support
-- Connection Timeouts
-- Streaming Downloads
-- ``.netrc`` Support
-- Chunked Requests
-
-Requests officially supports Python 2.6–2.7 & 3.3–3.7, and runs great on PyPy.
-
-Installation
-------------
-
-To install Requests, simply:
-
-.. code-block:: bash
-
- $ pip install requests
- ✨🍰✨
-
-Satisfaction guaranteed.
-
-Documentation
--------------
-
-Fantastic documentation is available at http://docs.python-requests.org/, for a limited time only.
-
-
-How to Contribute
------------------
-
-#. Check for open issues or open a fresh issue to start a discussion around a feature idea or a bug. There is a `Contributor Friendly`_ tag for issues that should be ideal for people who are not very familiar with the codebase yet.
-#. Fork `the repository`_ on GitHub to start making your changes to the **master** branch (or branch off of it).
-#. Write a test which shows that the bug was fixed or that the feature works as expected.
-#. Send a pull request and bug the maintainer until it gets merged and published. :) Make sure to add yourself to AUTHORS_.
-
-.. _`the repository`: http://github.com/requests/requests
-.. _AUTHORS: https://github.com/requests/requests/blob/master/AUTHORS.rst
-.. _Contributor Friendly: https://github.com/requests/requests/issues?direction=desc&labels=Contributor+Friendly&page=1&sort=updated&state=open
diff --git a/_appveyor/install.ps1 b/_appveyor/install.ps1
index 94d6f01813..160ba55c07 100644
--- a/_appveyor/install.ps1
+++ b/_appveyor/install.ps1
@@ -226,4 +226,4 @@ function main () {
InstallPip $env:PYTHON
}
-main
\ No newline at end of file
+main
diff --git a/appveyor.yml b/appveyor.yml
index 4f06557e84..3b6cef631b 100644
--- a/appveyor.yml
+++ b/appveyor.yml
@@ -1,5 +1,5 @@
# AppVeyor.yml from https://github.com/ogrisel/python-appveyor-demo
-# License: CC0 1.0 Universal: http://creativecommons.org/publicdomain/zero/1.0/
+# License: CC0 1.0 Universal: https://creativecommons.org/publicdomain/zero/1.0/
build: off
@@ -25,8 +25,13 @@ environment:
PYTHON_ARCH: "64"
TOXENV: "py36"
+ - PYTHON: "C:\\Python37-x64"
+ PYTHON_VERSION: "3.7.x"
+ PYTHON_ARCH: "64"
+ TOXENV: "py37"
+
install:
- # Install Python (from the official .msi of http://python.org) and pip when
+ # Install Python (from the official .msi of https://www.python.org/) and pip when
# not already installed.
- ps: if (-not(Test-Path($env:PYTHON))) { & _appveyor\install.ps1 }
@@ -41,11 +46,12 @@ install:
# Upgrade to the latest version of pip to avoid it displaying warnings
# about it being out of date.
- - "pip install --disable-pip-version-check --user --upgrade pip"
+ - "python -m pip install --upgrade pip wheel"
- "C:\\MinGW\\bin\\mingw32-make"
+ - "pipenv install -e .[socks] --skip-lock"
test_script:
- "C:\\MinGW\\bin\\mingw32-make coverage"
on_success:
- - "codecov -f coverage.xml"
+ - "pipenv run codecov -f coverage.xml"
diff --git a/docs/_static/custom.css b/docs/_static/custom.css
index 3a8af312d7..52493c5b31 100644
--- a/docs/_static/custom.css
+++ b/docs/_static/custom.css
@@ -1,12 +1,21 @@
+body > div.document > div.sphinxsidebar > div > form > table > tbody > tr:nth-child(2) > td > select {
+ width: 100%!important;
+}
+
+#python27 > a {
+ color: white;
+}
+
+/* Carbon by BuySellAds */
#carbonads {
display: block;
overflow: hidden;
+ margin: 1.5em 0 2em;
padding: 1em;
- background-color: #eeeeee;
- text-align: center;
border: solid 1px #cccccc;
- margin: 1.5em 0 2em;
border-radius: 2px;
+ background-color: #eeeeee;
+ text-align: center;
line-height: 1.5;
}
@@ -34,6 +43,140 @@
display: block;
text-transform: uppercase;
letter-spacing: 1px;
+ font-size: 10px;
line-height: 1;
+}
+
+
+/* Native CPC by BuySellAds */
+
+.native-js {
+ visibility: hidden;
+ font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto,
+ Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", Helvetica, Arial,
+ sans-serif;
+ opacity: 0;
+ transition: all .25s ease-in-out;
+ transform: translateY(calc(100% - 35px));
+
+ flex-flow: column nowrap;
+}
+
+.native-js[data-state=visible] {
+ position: fixed;
+ bottom: 0;
+ left: 0;
+ right: 0;
+ visibility: visible;
+ box-shadow: 0 -1px 4px 1px hsla(0, 0%, 0%, .15);
+ opacity: 1;
+}
+
+.native-js[data-state=visible]:hover {
+ transform: translateY(0);
+}
+
+.native-img {
+ margin-right: 20px;
+ max-height: 50px;
+ border-radius: 3px;
+}
+
+.native-sponsor {
+ margin: 10px 20px;
+ text-align: center;
+ text-transform: uppercase;
+ letter-spacing: .5px;
+ font-size: 12px;
+ transition: all .3s ease-in-out;
+ transform-origin: left;
+}
+
+.native-js[data-state=visible]:hover .native-sponsor {
+ margin: 0 20px;
+ opacity: 0;
+ transform: scaleY(0);
+}
+
+.native-flex {
+ display: flex;
+ padding: 0 20px 30px;
+ text-decoration: none;
+
+ flex-flow: row nowrap;
+ align-items: center;
+}
+
+.native-flex:hover {
+ text-decoration: none;
+}
+
+.native-main {
+ display: flex;
+
+ flex-flow: row nowrap;
+ align-items: center;
+}
+
+.native-details {
+ display: flex;
+ margin-right: 30px;
+
+ flex-flow: column nowrap;
+}
+
+.native-company {
+ margin-bottom: 4px;
+ text-transform: uppercase;
+ letter-spacing: 2px;
font-size: 10px;
}
+
+.native-desc {
+ letter-spacing: 1px;
+ font-weight: 300;
+ line-height: 1.4;
+}
+
+.native-cta {
+ padding: 10px 14px;
+ border-radius: 3px;
+ box-shadow: 0 6px 13px 0 hsla(0, 0%, 0%, .15);
+ text-transform: uppercase;
+ white-space: nowrap;
+ letter-spacing: 1px;
+ font-weight: 400;
+ font-size: 12px;
+ transition: all .3s ease-in-out;
+ transform: translateY(-1px);
+}
+
+.native-cta:hover {
+ box-shadow: none;
+ transform: translateY(1px);
+}
+
+@media only screen and (min-width: 320px) and (max-width: 759px) {
+ .native-flex {
+ flex-direction: column;
+
+ flex-wrap: wrap;
+ }
+
+ .native-img {
+ margin: 0;
+ }
+
+ .native-details {
+ margin: 0;
+ }
+
+ .native-main {
+ flex-direction: column;
+ margin-bottom: 20px;
+ text-align: center;
+
+ flex-wrap: wrap;
+ align-content: center;
+ }
+}
diff --git a/docs/_static/native.js b/docs/_static/native.js
new file mode 100644
index 0000000000..65aebecf20
--- /dev/null
+++ b/docs/_static/native.js
@@ -0,0 +1,131 @@
+var _native = (function () {
+ var _options = {}
+ var _construct = function (e) {
+ var defaultOptions = {
+ carbonZoneKey: '',
+ fallback: '',
+ ignore: 'false',
+ placement: '',
+ prefix: 'native',
+ targetClass: 'native-ad'
+ }
+
+ if (typeof e === 'undefined') return defaultOptions
+ Object.keys(defaultOptions).forEach((key, index) => {
+ if (typeof e[key] === 'undefined') {
+ e[key] = defaultOptions[key]
+ }
+ })
+ return e
+ }
+
+ var init = function (zone, options) {
+ _options = _construct(options)
+
+ let jsonUrl = `https://srv.buysellads.com/ads/${zone}.json?callback=_native_go`
+ if (_options['placement'] !== '') {
+ jsonUrl += '&segment=placement:' + _options['placement']
+ }
+ if (_options['ignore'] === 'true') {
+ jsonUrl += '&ignore=yes'
+ }
+
+ let srv = document.createElement('script')
+ srv.src = jsonUrl
+ document.getElementsByTagName('head')[0].appendChild(srv)
+ }
+
+ var carbon = function (e) {
+ let srv = document.createElement('script')
+ srv.src = '//cdn.carbonads.com/carbon.js?serve=' + e['carbonZoneKey'] + '&placement=' + e['placement']
+ srv.id = '_carbonads_js'
+
+ return srv
+ }
+
+ var sanitize = function (ads) {
+ return ads
+ .filter(ad => {
+ return Object.keys(ad).length > 0
+ })
+ .filter(ad => {
+ return ad.hasOwnProperty('statlink')
+ })
+ }
+
+ var pixel = function (p, timestamp) {
+ let c = ''
+ if (p) {
+ p.split('||').forEach((pixel, index) => {
+ c += ``
+ })
+ }
+ return c
+ }
+
+ var options = function () {
+ return _options
+ }
+
+ return {
+ carbon: carbon,
+ init: init,
+ options: options,
+ pixel: pixel,
+ sanitize: sanitize
+ }
+})({})
+
+var _native_go = function (json) {
+ let options = _native.options()
+ let ads = _native.sanitize(json['ads'])
+ let selectedClass = document.querySelectorAll('.' + options['targetClass'])
+
+ if (ads.length < 1) {
+ selectedClass.forEach((className, index) => {
+ let selectedTarget = document.getElementsByClassName(options['targetClass'])[index]
+
+ if (options['fallback'] !== '' || options['carbonZoneKey'] !== '') selectedTarget.setAttribute('data-state', 'visible')
+ selectedTarget.innerHTML = options['fallback']
+ if (options['carbonZoneKey'] !== '') selectedTarget.appendChild(_native.carbon(options))
+ })
+
+ // End at this line if no ads are found, avoiding unnecessary steps
+ return
+ }
+
+ selectedClass.forEach((className, index) => {
+ let selectedTarget = document.getElementsByClassName(options['targetClass'])[index]
+ let adElement = selectedTarget.innerHTML
+ let prefix = options['prefix']
+ let ad = ads[index]
+
+ if (ad && className) {
+ let adInnerHtml = adElement
+ .replace(new RegExp('#' + prefix + '_bg_color#', 'g'), ad['backgroundColor'])
+ .replace(new RegExp('#' + prefix + '_bg_color_hover#', 'g'), ad['backgroundHoverColor'])
+ .replace(new RegExp('#' + prefix + '_company#', 'g'), ad['company'])
+ .replace(new RegExp('#' + prefix + '_cta#', 'g'), ad['callToAction'])
+ .replace(new RegExp('#' + prefix + '_cta_bg_color#', 'g'), ad['ctaBackgroundColor'])
+ .replace(new RegExp('#' + prefix + '_cta_bg_color_hover#', 'g'), ad['ctaBackgroundHoverColor'])
+ .replace(new RegExp('#' + prefix + '_cta_color#', 'g'), ad['ctaTextColor'])
+ .replace(new RegExp('#' + prefix + '_cta_color_hover#', 'g'), ad['ctaTextColorHover'])
+ .replace(new RegExp('#' + prefix + '_desc#', 'g'), ad['description'])
+ .replace(new RegExp('#' + prefix + '_index#', 'g'), prefix + '-' + ad['i'])
+ .replace(new RegExp('#' + prefix + '_img#', 'g'), ad['image'])
+ .replace(new RegExp('#' + prefix + '_small_img#', 'g'), ad['smallImage'])
+ .replace(new RegExp('#' + prefix + '_link#', 'g'), ad['statlink'])
+ .replace(new RegExp('#' + prefix + '_logo#', 'g'), ad['logo'])
+ .replace(new RegExp('#' + prefix + '_color#', 'g'), ad['textColor'])
+ .replace(new RegExp('#' + prefix + '_color_hover#', 'g'), ad['textColorHover'])
+ .replace(new RegExp('#' + prefix + '_title#', 'g'), ad['title'])
+
+ selectedTarget.innerHTML = null
+ selectedTarget.innerHTML += adInnerHtml + _native.pixel(ad['pixel'], ad['timestamp'])
+ selectedTarget.setAttribute('data-state', 'visible')
+ } else {
+ selectedTarget.innerHTML = null
+ selectedTarget.style.display = 'none'
+ }
+ })
+}
diff --git a/docs/_templates/hacks.html b/docs/_templates/hacks.html
index f9fc96cbd0..61252a5cca 100644
--- a/docs/_templates/hacks.html
+++ b/docs/_templates/hacks.html
@@ -26,6 +26,7 @@
+
+
@@ -51,4 +53,51 @@
var easter_egg = new Konami('http://fortunes.herokuapp.com/random/raw');
-
\ No newline at end of file
+
+
+
+
+
+
+
+
Requests is an elegant and simple HTTP library for Python, built for
human beings. You are currently looking at the documentation of the
development release.
-
diff --git a/docs/api.rst b/docs/api.rst
index ed61bb3869..93cc4f0d20 100644
--- a/docs/api.rst
+++ b/docs/api.rst
@@ -109,17 +109,7 @@ Status Code Lookup
.. autoclass:: requests.codes
-::
-
- >>> requests.codes['temporary_redirect']
- 307
-
- >>> requests.codes.teapot
- 418
-
- >>> requests.codes['\o/']
- 200
-
+.. automodule:: requests.status_codes
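The literal block removed above documented behavior that still holds; as a quick sketch of the lookups it showed:

```python
import requests

# requests.codes maps many human-friendly names to HTTP status integers.
print(requests.codes['temporary_redirect'])  # 307
print(requests.codes.teapot)                 # 418
print(requests.codes['\\o/'])                # 200
```
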
Migrating to 1.x
@@ -149,7 +139,7 @@ API Changes
s = requests.Session() # formerly, session took parameters
s.auth = auth
s.headers.update(headers)
- r = s.get('http://httpbin.org/headers')
+ r = s.get('https://httpbin.org/headers')
* All request hooks have been removed except 'response'.
@@ -191,11 +181,11 @@ API Changes
logging.basicConfig() # you need to initialize logging, otherwise you will not see anything from requests
logging.getLogger().setLevel(logging.DEBUG)
- requests_log = logging.getLogger("requests.packages.urllib3")
+ requests_log = logging.getLogger("urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True
- requests.get('http://httpbin.org/headers')
+ requests.get('https://httpbin.org/headers')
@@ -207,8 +197,8 @@ license from the ISC_ license to the `Apache 2.0`_ license. The Apache 2.0
license ensures that contributions to Requests are also covered by the Apache
2.0 license.
-.. _ISC: http://opensource.org/licenses/ISC
-.. _Apache 2.0: http://opensource.org/licenses/Apache-2.0
+.. _ISC: https://opensource.org/licenses/ISC
+.. _Apache 2.0: https://opensource.org/licenses/Apache-2.0
Migrating to 2.x
@@ -223,7 +213,7 @@ For more details on the changes in this release including new APIs, links
to the relevant GitHub issues and some of the bug fixes, read Cory's blog_
on the subject.
-.. _blog: http://lukasa.co.uk/2013/09/Requests_20/
+.. _blog: https://lukasa.co.uk/2013/09/Requests_20/
API Changes
diff --git a/docs/community/faq.rst b/docs/community/faq.rst
index 7087229265..945096dc64 100644
--- a/docs/community/faq.rst
+++ b/docs/community/faq.rst
@@ -3,6 +3,8 @@
Frequently Asked Questions
==========================
+.. image:: https://farm5.staticflickr.com/4290/35294660055_42c02b2316_k_d.jpg
+
This part of the documentation answers common questions about Requests.
Encoded Data?
@@ -54,11 +56,11 @@ Python 3 Support?
Yes! Here's a list of Python platforms that are officially
supported:
-* Python 2.6
* Python 2.7
* Python 3.4
* Python 3.5
* Python 3.6
+* Python 3.7
* PyPy
What are "hostname doesn't match" errors?
@@ -68,7 +70,7 @@ These errors occur when :ref:`SSL certificate verification `
fails to match the certificate the server responds with to the hostname
Requests thinks it's contacting. If you're certain the server's SSL setup is
correct (for example, because you can visit the site with your browser) and
-you're using Python 2.6 or 2.7, a possible explanation is that you need
+you're using Python 2.7, a possible explanation is that you need
Server-Name-Indication.
`Server-Name-Indication`_, or SNI, is an official extension to SSL where the
diff --git a/docs/community/out-there.rst b/docs/community/out-there.rst
index 645c0ac4fe..79b21c6da2 100644
--- a/docs/community/out-there.rst
+++ b/docs/community/out-there.rst
@@ -1,6 +1,8 @@
Integrations
============
+.. image:: https://farm5.staticflickr.com/4239/34450900674_15863ddea0_k_d.jpg
+
Python for iOS
--------------
@@ -13,10 +15,10 @@ To give it a try, simply::
Articles & Talks
================
-- `Python for the Web `_ teaches how to use Python to interact with the web, using Requests.
-- `Daniel Greenfeld's Review of Requests `_
-- `My 'Python for Humans' talk `_ ( `audio `_ )
-- `Issac Kelly's 'Consuming Web APIs' talk `_
-- `Blog post about Requests via Yum `_
-- `Russian blog post introducing Requests `_
+- `Python for the Web `_ teaches how to use Python to interact with the web, using Requests.
+- `Daniel Greenfeld's Review of Requests `_
+- `My 'Python for Humans' talk `_ ( `audio `_ )
+- `Issac Kelly's 'Consuming Web APIs' talk `_
+- `Blog post about Requests via Yum `_
+- `Russian blog post introducing Requests `_
- `Sending JSON in Requests `_
diff --git a/docs/community/recommended.rst b/docs/community/recommended.rst
index ae2ae5eb43..8fcd47a436 100644
--- a/docs/community/recommended.rst
+++ b/docs/community/recommended.rst
@@ -3,6 +3,8 @@
Recommended Packages and Extensions
===================================
+.. image:: https://farm5.staticflickr.com/4218/35224319272_cfc0e621fb_k_d.jpg
+
Requests has a great variety of powerful and useful third-party extensions.
This page provides an overview of some of the best of them.
@@ -13,7 +15,7 @@ Certifi CA Bundle
validating the trustworthiness of SSL certificates while verifying the
identity of TLS hosts. It has been extracted from the Requests project.
-.. _Certifi: http://certifi.io/en/latest/
+.. _Certifi: https://github.com/certifi/python-certifi
CacheControl
------------
@@ -32,7 +34,15 @@ but do not belong in Requests proper. This library is actively maintained
by members of the Requests core team, and reflects the functionality most
requested by users within the community.
-.. _Requests-Toolbelt: http://toolbelt.readthedocs.io/en/latest/index.html
+.. _Requests-Toolbelt: https://toolbelt.readthedocs.io/en/latest/index.html
+
+
+Requests-Threads
+----------------
+
+`Requests-Threads`_ is a Requests session that returns Twisted's amazing awaitable Deferreds instead of Response objects. This allows the use of the ``async``/``await`` keywords on Python 3, or Twisted's style of programming, if desired.
+
+.. _Requests-Threads: https://github.com/requests/requests-threads
Requests-OAuthlib
-----------------
@@ -52,6 +62,3 @@ Betamax
A VCR imitation designed only for Python-Requests.
.. _betamax: https://github.com/sigmavirus24/betamax
-
-
-
diff --git a/docs/community/release-process.rst b/docs/community/release-process.rst
index c0943ddce6..18f71168a5 100644
--- a/docs/community/release-process.rst
+++ b/docs/community/release-process.rst
@@ -1,6 +1,8 @@
Release Process and Rules
=========================
+.. image:: https://farm5.staticflickr.com/4215/34450901614_b74ae720db_k_d.jpg
+
.. versionadded:: v2.6.2
Starting with the version to be released after ``v2.6.2``, the following rules
@@ -17,19 +19,18 @@ Breaking changes are changes that break backwards compatibility with prior
versions. If the project were to change the ``text`` attribute on a
``Response`` object to a method, that would only happen in a Major release.
-Major releases may also include miscellaneous bug fixes and upgrades to
-vendored packages. The core developers of Requests are committed to providing
-a good user experience. This means we're also committed to preserving
-backwards compatibility as much as possible. Major releases will be infrequent
-and will need strong justifications before they are considered.
+Major releases may also include miscellaneous bug fixes. The core developers of
+Requests are committed to providing a good user experience. This means we're
+also committed to preserving backwards compatibility as much as possible. Major
+releases will be infrequent and will need strong justifications before they are
+considered.
Minor Releases
--------------
-A minor release will not include breaking changes but may include
-miscellaneous bug fixes and upgrades to vendored packages. If the previous
-version of Requests released was ``v10.2.7`` a minor release would be
-versioned as ``v10.3.0``.
+A minor release will not include breaking changes but may include miscellaneous
+bug fixes. If the previous version of Requests released was ``v10.2.7`` a minor
+release would be versioned as ``v10.3.0``.
Minor releases will be backwards compatible with releases that have the same
major version number. In other words, all versions that would start with
diff --git a/docs/community/sponsors.rst b/docs/community/sponsors.rst
new file mode 100644
index 0000000000..f1e11efdd0
--- /dev/null
+++ b/docs/community/sponsors.rst
@@ -0,0 +1,96 @@
+Community Sponsors
+==================
+
+**tl;dr**: Requests development is currently `funded by the Python community `_, and
+some wonderful organizations that utilize the software in their businesses.
+
+
+-------------------
+
+
+Requests is one of the most heavily-utilized Python packages in the world.
+
+It is used by major corporations worldwide for all tasks, both small and large – from writing one-off scripts to orchestrating millions of dollars of critical infrastructure.
+
+It's even embedded within pip, the tool that you use to install packages and deploy with every day!
+
+After losing our primary open source maintainer (who was sponsored by a company to work on Requests, and other projects, full-time), we are seeking community financial contributions towards the development of Requests 3.0.
+
+Patron Sponsors
+----------------
+
+
+`Linode – SSD Cloud Hosting & Linux Servers `_
+//////////////////////////////////////////////////////////////////////
+
+Whether you're just getting started or deploying a complex system, launching a Linode cloud server has never been easier. They offer the fastest hardware and network in the industry with scalable environments, and their 24x7 customer support team is always standing by to help with any questions.
+
+✨🍰✨
+//////
+
+----------------------------------
+
+This slot is reserved for ethical organizations willing to invest $10,000 or more in Requests per year.
+
+By becoming a patron-level sponsor, your organization will receive the following benefits:
+
+- Prominent placement on the Requests documentation sidebar (~11,000 uniques / day).
+- Honorable mention here, with logo.
+- Peace of mind knowing that the infrastructure you rely on is being actively maintained.
+
+Organizations that sign up will be listed in order – first come, first served!
+
+Major Sponsors
+--------------
+
+The following organizations have significantly contributed towards Requests' sustainability:
+
+`Slack – Bring your team together `_
+///////////////////////////////////////////////////////
+
+Slack was extremely kind to be the first organization to generously donate a large sum towards the `2018 Requests 3.0 fundraiser `_, surpassing our entire fundraising goal immediately! They are helping the world become a better place through connectedness, and reducing the amount of email we all have
+to deal with on a daily basis.
+
+P.S. They're `hiring `_!
+
+
+`Twilio – Voice, SMS, and Video for Humans `_
+/////////////////////////////////////////////////////////////////////
+
+Twilio was the second organization to generously donate a large sum towards the `2018 Requests 3.0 fundraiser `_, matching the donation of Slack! They are helping the world become a better place through interconnectivity,
+providing easy-to-use APIs, and empowering developers the world over to help humans communicate in meaningful and effective ways.
+
+
+`Azure Cloud Developer Advocates `_
+/////////////////////////////////////////////////////////////////////////////////////
+
+Azure was the third organization to generously donate a large sum towards the `2018 Requests 3.0 fundraiser `_, matching the donation of Twilio! Awesome group of generous folks :)
+
+
+`Niteo – Web Systems Development `_
+/////////////////////////////////////////////////////////////
+
+Niteo was the fourth company to generously donate towards the `2018 Requests 3.0 fundraiser `_. Niteo is a company employing tech enthusiasts from all over the world
+who love to build great stuff.
+
+
+`Heroku `_
+/////////////////////////////////////
+
+Heroku has allowed Kenneth Reitz to work on some open source projects during work hours,
+including Requests (but mostly Pipenv), from time to time, so they are listed
+here as an honorable mention.
+
+----------------
+
+If your organization is interested in becoming either a sponsor or a patron, please `send us an email `_.
+
+
+Individual Sponsors
+-------------------
+
+Countless individuals, too many to list here, have contributed towards the sustainability of the Requests
+project over the years. Some financially, others with code. Contributions (from humans) of all kinds are greatly
+appreciated.
+
+✨🍰✨
\ No newline at end of file
diff --git a/docs/community/support.rst b/docs/community/support.rst
index 4d5b0c35fc..43c2801c22 100644
--- a/docs/community/support.rst
+++ b/docs/community/support.rst
@@ -3,20 +3,22 @@
Support
=======
+.. image:: https://farm5.staticflickr.com/4198/34080352913_5c13ffb336_k_d.jpg
+
If you have questions or issues about Requests, there are several options:
-StackOverflow
+Stack Overflow
-------------
If your question does not contain sensitive (possibly proprietary)
information or can be properly anonymized, please ask a question on
-`StackOverflow `_
+`Stack Overflow <https://stackoverflow.com/questions/tagged/python-requests>`_
and use the tag ``python-requests``.
Send a Tweet
------------
-If your question is less than 140 characters, feel free to send a tweet to
+If your question is less than 280 characters, feel free to send a tweet to
`@kennethreitz `_,
`@sigmavirus24 `_, or
`@lukasaoz `_.
diff --git a/docs/community/updates.rst b/docs/community/updates.rst
index e1f0d8db56..3b9a30970e 100644
--- a/docs/community/updates.rst
+++ b/docs/community/updates.rst
@@ -4,6 +4,8 @@
Community Updates
=================
+.. image:: https://farm5.staticflickr.com/4244/34080354873_516c283ad0_k_d.jpg
+
If you'd like to stay up to date on the community and development of Requests,
there are several options:
@@ -27,4 +29,3 @@ Release and Version History
===========================
.. include:: ../../HISTORY.rst
-
diff --git a/docs/community/vulnerabilities.rst b/docs/community/vulnerabilities.rst
index 5ff3f2ccad..7e299e36b9 100644
--- a/docs/community/vulnerabilities.rst
+++ b/docs/community/vulnerabilities.rst
@@ -1,6 +1,8 @@
Vulnerability Disclosure
========================
+.. image:: https://farm5.staticflickr.com/4211/34709353644_b041e9e1c2_k_d.jpg
+
If you think you have found a potential security vulnerability in requests,
please email `sigmavirus24 `_ and
`Lukasa `_ directly. **Do not file a public issue.**
@@ -95,11 +97,11 @@ Previous CVEs
- Fixed in 2.6.0
- - `CVE 2015-2296 `_,
+ - `CVE 2015-2296 `_,
reported by Matthew Daley of `BugFuzz `_.
- Fixed in 2.3.0
- - `CVE 2014-1829 `_
+ - `CVE 2014-1829 `_
- - `CVE 2014-1830 `_
+ - `CVE 2014-1830 `_
diff --git a/docs/conf.py b/docs/conf.py
index b1c941e141..fb10abdbaf 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -58,7 +58,7 @@
# General information about the project.
project = u'Requests'
-copyright = u'MMXVII. A Kenneth Reitz Project'
+copyright = u'MMXVIII. A Kenneth Reitz Project'
author = u'Kenneth Reitz'
# The version info for the project you're documenting, acts as replacement for
@@ -129,7 +129,8 @@
'github_user': 'requests',
'github_repo': 'requests',
'github_banner': True,
- 'show_related': False
+ 'show_related': False,
+ 'note_bg': '#FFF59C'
}
# Add any paths that contain custom themes here, relative to this directory.
@@ -375,4 +376,7 @@
# If false, no index is generated.
#epub_use_index = True
-intersphinx_mapping = {'urllib3': ('http://urllib3.readthedocs.io/en/latest', None)}
+intersphinx_mapping = {
+ 'python': ('https://docs.python.org/3/', None),
+ 'urllib3': ('https://urllib3.readthedocs.io/en/latest', None),
+}
diff --git a/docs/dev/authors.rst b/docs/dev/authors.rst
index 2e96919c64..4cdd14cd43 100644
--- a/docs/dev/authors.rst
+++ b/docs/dev/authors.rst
@@ -1,5 +1,6 @@
Authors
=======
+.. image:: https://static1.squarespace.com/static/533ad9bde4b098d084a846b1/t/534f6e1ce4b09b70f38ee6c1/1432265542589/DSCF3147.jpg?format=2500w
.. include:: ../../AUTHORS.rst
diff --git a/docs/dev/contributing.rst b/docs/dev/contributing.rst
index 265994b324..45e5cfa104 100644
--- a/docs/dev/contributing.rst
+++ b/docs/dev/contributing.rst
@@ -3,6 +3,8 @@
Contributor's Guide
===================
+.. image:: https://farm5.staticflickr.com/4237/35550408335_7671fde302_k_d.jpg
+
If you're reading this, you're probably interested in contributing to Requests.
Thank you very much! Open source projects live-and-die based on the support
they receive from others, and the fact that you're even considering
@@ -39,7 +41,7 @@ including reporting bugs or requesting features. This golden rule is
**All contributions are welcome**, as long as
everyone involved is treated with respect.
-.. _be cordial or be on your way: http://kennethreitz.org/be-cordial-or-be-on-your-way/
+.. _be cordial or be on your way: https://www.kennethreitz.org/essays/be-cordial-or-be-on-your-way
.. _early-feedback:
@@ -155,7 +157,7 @@ model methods (e.g. ``__repr__``) are typically the exception to this rule.
Thanks for helping to make the world a better place!
-.. _PEP 8: http://pep8.org
+.. _PEP 8: https://pep8.org/
.. _line continuations: https://www.python.org/dev/peps/pep-0008/#indentation
Documentation Contributions
@@ -203,4 +205,4 @@ while keeping an open ear and mind.
If you believe there is a feature missing, feel free to raise a feature
request, but please do be aware that the overwhelming likelihood is that your
-feature request will not be accepted.
\ No newline at end of file
+feature request will not be accepted.
diff --git a/docs/dev/philosophy.rst b/docs/dev/philosophy.rst
index c0c0612fd7..c9d8713595 100644
--- a/docs/dev/philosophy.rst
+++ b/docs/dev/philosophy.rst
@@ -1,13 +1,15 @@
Development Philosophy
======================
+.. image:: https://farm5.staticflickr.com/4231/34484831073_636008a23d_k_d.jpg
+
Requests is an open but opinionated library, created by an open but opinionated developer.
Management Style
~~~~~~~~~~~~~~~~
-`Kenneth Reitz `_ is the BDFL. He has final say in any decision related to the Requests project. Kenneth is responsible for the direction and form of the library, as well as its presentation. In addition to making decisions based on technical merit, he is responsible for making decisions based on the development philosophy of Requests.
+`Kenneth Reitz `_ is the BDFL. He has final say in any decision related to the Requests project. Kenneth is responsible for the direction and form of the library, as well as its presentation. In addition to making decisions based on technical merit, he is responsible for making decisions based on the development philosophy of Requests.
`Ian Cordasco `_ and `Cory Benfield `_ are the core contributors. They are responsible for triaging bug reports, reviewing pull requests and ensuring that Kenneth is kept up to speed with developments around the library. The day-to-day managing of the project is done by the core contributors. They are responsible for making judgements about whether or not a feature request is likely to be accepted by Kenneth. Their word is, in some ways, more final than Kenneth's.
@@ -24,13 +26,17 @@ Semantic Versioning
For many years, the open source community has been plagued with version number dystonia. Numbers vary so greatly from project to project, they are practically meaningless.
-Requests uses `Semantic Versioning `_. This specification seeks to put an end to this madness with a small set of practical guidelines for you and your colleagues to use in your next project.
+Requests uses `Semantic Versioning `_. This specification seeks to put an end to this madness with a small set of practical guidelines for you and your colleagues to use in your next project.
Standard Library?
~~~~~~~~~~~~~~~~~
Requests has no *active* plans to be included in the standard library. This decision has been discussed at length with Guido as well as numerous core developers.
+.. raw:: html
+
+
+
Essentially, the standard library is where a library goes to die. It is appropriate for a module to be included when active development is no longer necessary.
Linux Distro Packages
diff --git a/docs/dev/todo.rst b/docs/dev/todo.rst
index 75ec30850d..26cd9b716b 100644
--- a/docs/dev/todo.rst
+++ b/docs/dev/todo.rst
@@ -1,6 +1,8 @@
How to Help
===========
+.. image:: https://farm5.staticflickr.com/4290/34450900104_bc1d424213_k_d.jpg
+
Requests is under active development, and contributions are more than welcome!
#. Check for open issues or open a fresh issue to start a discussion around a bug.
@@ -49,15 +51,14 @@ Runtime Environments
Requests currently supports the following versions of Python:
-- Python 2.6
- Python 2.7
- Python 3.4
- Python 3.5
- Python 3.6
+- Python 3.7
- PyPy
Google AppEngine is not officially supported although support is available
with the `Requests-Toolbelt`_.
-.. _Requests-Toolbelt: http://toolbelt.readthedocs.io/
-
+.. _Requests-Toolbelt: https://toolbelt.readthedocs.io/
diff --git a/docs/index.rst b/docs/index.rst
index d4778ae0a2..9e946dae09 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -3,19 +3,19 @@
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
-Requests: HTTP for Humans
-=========================
+Requests: HTTP for Humans™
+==========================
Release v\ |version|. (:ref:`Installation `)
.. image:: https://img.shields.io/pypi/l/requests.svg
- :target: https://pypi.python.org/pypi/requests
+ :target: https://pypi.org/project/requests/
.. image:: https://img.shields.io/pypi/wheel/requests.svg
- :target: https://pypi.python.org/pypi/requests
+ :target: https://pypi.org/project/requests/
.. image:: https://img.shields.io/pypi/pyversions/requests.svg
- :target: https://pypi.python.org/pypi/requests
+ :target: https://pypi.org/project/requests/
.. image:: https://codecov.io/github/requests/requests/coverage.svg?branch=master
:target: https://codecov.io/github/requests/requests
@@ -28,15 +28,17 @@ Release v\ |version|. (:ref:`Installation `)
**Requests** is the only *Non-GMO* HTTP library for Python, safe for human
consumption.
-*Warning: Recreational use of the Python standard library for HTTP may result in dangerous side-effects,
-including: security vulnerabilities, verbose code, reinventing the wheel,
-constantly reading documentation, depression, headaches, or even death.*
+.. note:: The use of **Python 3** is *highly* preferred over Python 2. Consider upgrading your applications and infrastructure if you find yourself *still* using Python 2 in production today. If you are using Python 3, congratulations — you are indeed a person of excellent taste.
+   —*Kenneth Reitz*
+
+
+If you're interested in financially supporting Kenneth Reitz's open source work, consider visiting `this link `_. Your support helps tremendously with sustainability of motivation, as Open Source is no longer part of his day job.
+
-------------------
**Behold, the power of Requests**::
- >>> import requests
>>> r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
>>> r.status_code
200
@@ -60,12 +62,12 @@ are 100% automatic, thanks to `urllib3 `_.
User Testimonials
-----------------
-Twitter, Spotify, Microsoft, Amazon, Lyft, BuzzFeed, Reddit, The NSA, Her Majesty's Government, Google, Twilio, Runscope, Mozilla, Heroku,
+Nike, Twitter, Spotify, Microsoft, Amazon, Lyft, BuzzFeed, Reddit, The NSA, Her Majesty's Government, Google, Twilio, Runscope, Mozilla, Heroku,
PayPal, NPR, Obama for America, Transifex, Native Instruments, The Washington
Post, SoundCloud, Kippt, Sony, and Federal U.S.
Institutions that prefer to be unnamed claim to use Requests internally.
-**Armin Ronacher**—
+**Armin Ronacher**, creator of Flask—
*Requests is the perfect example how beautiful an API can be with the
right level of abstraction.*
@@ -75,14 +77,18 @@ Institutions that prefer to be unnamed claim to use Requests internally.
**Daniel Greenfeld**—
*Nuked a 1200 LOC spaghetti code library with 10 lines of code thanks to
- Kenneth Reitz's request library. Today has been AWESOME.*
+ Kenneth Reitz's Requests library. Today has been AWESOME.*
**Kenny Meyers**—
*Python HTTP: When in doubt, or when not in doubt, use Requests. Beautiful,
simple, Pythonic.*
Requests is one of the most downloaded Python packages of all time, pulling in
-over 11,000,000 downloads every month. All the cool kids are doing it!
+over 400,000 downloads **each day**. Join the party!
+
+If your organization uses Requests internally, consider `supporting the development of 3.0 `_. Your
+generosity will be greatly appreciated, and will help drive the project
+forward into the future.
Beloved Features
----------------
@@ -105,7 +111,7 @@ Requests is ready for today's web.
- Chunked Requests
- ``.netrc`` Support
-Requests officially supports Python 2.6–2.7 & 3.4–3.7, and runs great on PyPy.
+Requests officially supports Python 2.7 & 3.4–3.7, and runs great on PyPy.
The User Guide
@@ -132,10 +138,11 @@ This part of the documentation, which is mostly prose, details the
Requests ecosystem and community.
.. toctree::
- :maxdepth: 1
+ :maxdepth: 2
- community/faq
+ community/sponsors
community/recommended
+ community/faq
community/out-there
community/support
community/vulnerabilities
diff --git a/docs/user/advanced.rst b/docs/user/advanced.rst
index 4aa1dfac13..9a615aae73 100644
--- a/docs/user/advanced.rst
+++ b/docs/user/advanced.rst
@@ -3,6 +3,8 @@
Advanced Usage
==============
+.. image:: https://farm5.staticflickr.com/4263/35163665790_d182d84f5e_k_d.jpg
+
This document covers some of Requests' more advanced features.
.. _session-objects:
@@ -23,8 +25,8 @@ Let's persist some cookies across requests::
s = requests.Session()
- s.get('http://httpbin.org/cookies/set/sessioncookie/123456789')
- r = s.get('http://httpbin.org/cookies')
+ s.get('https://httpbin.org/cookies/set/sessioncookie/123456789')
+ r = s.get('https://httpbin.org/cookies')
print(r.text)
# '{"cookies": {"sessioncookie": "123456789"}}'
@@ -38,7 +40,7 @@ is done by providing data to the properties on a Session object::
s.headers.update({'x-test': 'true'})
# both 'x-test' and 'x-test2' are sent
- s.get('http://httpbin.org/headers', headers={'x-test2': 'true'})
+ s.get('https://httpbin.org/headers', headers={'x-test2': 'true'})
Any dictionaries that you pass to a request method will be merged with the
@@ -51,11 +53,11 @@ with the first request, but not the second::
s = requests.Session()
- r = s.get('http://httpbin.org/cookies', cookies={'from-my': 'browser'})
+ r = s.get('https://httpbin.org/cookies', cookies={'from-my': 'browser'})
print(r.text)
# '{"cookies": {"from-my": "browser"}}'
- r = s.get('http://httpbin.org/cookies')
+ r = s.get('https://httpbin.org/cookies')
print(r.text)
# '{"cookies": {}}'
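The merging of method-level and session-level values described above can also be inspected without any network traffic, by preparing a request against the session (a minimal sketch; the httpbin URL is only a placeholder and is never contacted):

```python
import requests

s = requests.Session()
s.headers.update({'x-test': 'true'})

# prepare_request applies the session-level merge without sending anything
req = requests.Request('GET', 'https://httpbin.org/headers',
                       headers={'x-test2': 'true'})
prepped = s.prepare_request(req)

# Both the session-level and the method-level header are present
print(prepped.headers['x-test'], prepped.headers['x-test2'])
```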
@@ -67,7 +69,7 @@ If you want to manually add cookies to your session, use the
Sessions can also be used as context managers::
with requests.Session() as s:
- s.get('http://httpbin.org/cookies/set/sessioncookie/123456789')
+ s.get('https://httpbin.org/cookies/set/sessioncookie/123456789')
This will make sure the session is closed as soon as the ``with`` block is
exited, even if unhandled exceptions occurred.
@@ -95,7 +97,7 @@ The ``Response`` object contains all of the information returned by the server a
also contains the ``Request`` object you created originally. Here is a simple
request to get some very important information from Wikipedia's servers::
- >>> r = requests.get('http://en.wikipedia.org/wiki/Monty_Python')
+ >>> r = requests.get('https://en.wikipedia.org/wiki/Monty_Python')
If we want to access the headers the server sent back to us, we do this::
@@ -187,6 +189,25 @@ applied, replace the call to :meth:`Request.prepare()
print(resp.status_code)
+When you are using the prepared request flow, keep in mind that it does not take the environment into account.
+This can cause problems if you are using environment variables to change the behaviour of Requests.
+For example: self-signed SSL certificates specified in ``REQUESTS_CA_BUNDLE`` will not be taken into account.
+As a result, an ``SSL: CERTIFICATE_VERIFY_FAILED`` error is raised.
+You can get around this behaviour by explicitly merging the environment settings into your session::
+
+ from requests import Request, Session
+
+ s = Session()
+ req = Request('GET', url)
+
+ prepped = s.prepare_request(req)
+
+ # Merge environment settings into session
+    settings = s.merge_environment_settings(prepped.url, {}, None, None, None)
+ resp = s.send(prepped, **settings)
+
+ print(resp.status_code)
+
.. _verification:
SSL Cert Verification
@@ -253,21 +274,20 @@ If you specify a wrong path or an invalid cert, you'll get a SSLError::
CA Certificates
---------------
-By default, Requests bundles a set of root CAs that it trusts, sourced from the
-`Mozilla trust store`_. However, these are only updated once for each Requests
-version. This means that if you pin a Requests version your certificates can
-become extremely out of date.
+Requests uses certificates from the package `certifi`_. This allows for users
+to update their trusted certificates without changing the version of Requests.
-From Requests version 2.4.0 onwards, Requests will attempt to use certificates
-from `certifi`_ if it is present on the system. This allows for users to update
-their trusted certificates without having to change the code that runs on their
-system.
+Before version 2.16, Requests bundled a set of root CAs that it trusted,
+sourced from the `Mozilla trust store`_. The certificates were only updated
+once for each Requests version. When ``certifi`` was not installed, this led to
+extremely out-of-date certificate bundles when using significantly older
+versions of Requests.
For the sake of security we recommend upgrading certifi frequently!
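To see which CA bundle is in use, ``certifi`` exposes the path directly (a small sketch; ``certifi`` ships as a dependency of Requests):

```python
import certifi

# Path to the CA bundle file (cacert.pem) used for certificate verification
print(certifi.where())
```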
.. _HTTP persistent connection: https://en.wikipedia.org/wiki/HTTP_persistent_connection
-.. _connection pooling: http://urllib3.readthedocs.io/en/latest/reference/index.html#module-urllib3.connectionpool
-.. _certifi: http://certifi.io/
+.. _connection pooling: https://urllib3.readthedocs.io/en/latest/reference/index.html#module-urllib3.connectionpool
+.. _certifi: https://certifiio.readthedocs.io/
.. _Mozilla trust store: https://hg.mozilla.org/mozilla-central/raw-file/tip/security/nss/lib/ckfw/builtins/certdata.txt
.. _body-content-workflow:
@@ -303,7 +323,7 @@ inefficiency with connections. If you find yourself partially reading request
bodies (or not reading them at all) while using ``stream=True``, you should
make the request within a ``with`` statement to ensure it's always closed::
- with requests.get('http://httpbin.org/get', stream=True) as r:
+ with requests.get('https://httpbin.org/get', stream=True) as r:
# Do things with the response here.
.. _keep-alive:
@@ -331,13 +351,11 @@ file-like object for your body::
with open('massive-body', 'rb') as f:
requests.post('http://some.url/streamed', data=f)
-.. warning:: It is strongly recommended that you open files in `binary mode`_.
- This is because Requests may attempt to provide the
- ``Content-Length`` header for you, and if it does this value will
- be set to the number of *bytes* in the file. Errors may occur if
- you open the file in *text mode*.
-
-.. _binary mode: https://docs.python.org/2/tutorial/inputoutput.html#reading-and-writing-files
+.. warning:: It is strongly recommended that you open files in :ref:`binary
+ mode `. This is because Requests may attempt to provide
+ the ``Content-Length`` header for you, and if it does this value
+ will be set to the number of *bytes* in the file. Errors may occur
+ if you open the file in *text mode*.
.. _chunk-encoding:
@@ -375,7 +393,7 @@ upload image files to an HTML form with a multiple file field 'images'::
To do that, just set files to a list of tuples of ``(form_field_name, file_info)``::
- >>> url = 'http://httpbin.org/post'
+ >>> url = 'https://httpbin.org/post'
>>> multiple_files = [
('images', ('foo.png', open('foo.png', 'rb'), 'image/png')),
('images', ('bar.png', open('bar.png', 'rb'), 'image/png'))]
@@ -388,13 +406,11 @@ To do that, just set files to a list of tuples of ``(form_field_name, file_info)
...
}
-.. warning:: It is strongly recommended that you open files in `binary mode`_.
- This is because Requests may attempt to provide the
- ``Content-Length`` header for you, and if it does this value will
- be set to the number of *bytes* in the file. Errors may occur if
- you open the file in *text mode*.
-
-.. _binary mode: https://docs.python.org/2/tutorial/inputoutput.html#reading-and-writing-files
+.. warning:: It is strongly recommended that you open files in :ref:`binary
+ mode `. This is because Requests may attempt to provide
+ the ``Content-Length`` header for you, and if it does this value
+ will be set to the number of *bytes* in the file. Errors may occur
+ if you open the file in *text mode*.
.. _event-hooks:
@@ -415,7 +431,7 @@ You can assign a hook function on a per-request basis by passing a
``{hook_name: callback_function}`` dictionary to the ``hooks`` request
parameter::
- hooks=dict(response=print_url)
+ hooks={'response': print_url}
That ``callback_function`` will receive the ``Response`` object as its first
argument.
@@ -429,14 +445,38 @@ If an error occurs while executing your callback, a warning is given.
If the callback function returns a value, it is assumed that it is to
replace the data that was passed in. If the function doesn't return
-anything, nothing else is effected.
+anything, nothing else is affected.
+
+::
+
+ def record_hook(r, *args, **kwargs):
+ r.hook_called = True
+ return r
Let's print some request method arguments at runtime::
- >>> requests.get('http://httpbin.org', hooks=dict(response=print_url))
- http://httpbin.org
+ >>> requests.get('https://httpbin.org/', hooks={'response': print_url})
+ https://httpbin.org/
+You can add multiple hooks to a single request. Let's call two hooks at once::
+
+ >>> r = requests.get('https://httpbin.org/', hooks={'response': [print_url, record_hook]})
+ >>> r.hook_called
+ True
+
+You can also add hooks to a ``Session`` instance. Any hooks you add will then
+be called on every request made to the session. For example::
+
+ >>> s = requests.Session()
+ >>> s.hooks['response'].append(print_url)
+ >>> s.get('https://httpbin.org/')
+ https://httpbin.org/
+
+
+A ``Session`` can have multiple hooks, which will be called in the order
+they are added.
+
.. _custom-auth:
Custom Authentication
@@ -489,7 +529,7 @@ set ``stream`` to ``True`` and iterate over the response with
import json
import requests
- r = requests.get('http://httpbin.org/stream/20', stream=True)
+ r = requests.get('https://httpbin.org/stream/20', stream=True)
for line in r.iter_lines():
@@ -503,7 +543,7 @@ When using `decode_unicode=True` with
:meth:`Response.iter_content() `, you'll want
to provide a fallback encoding in the event the server doesn't provide one::
- r = requests.get('http://httpbin.org/stream/20', stream=True)
+ r = requests.get('https://httpbin.org/stream/20', stream=True)
if r.encoding is None:
r.encoding = 'utf-8'
@@ -612,12 +652,12 @@ When you receive a response, Requests makes a guess at the encoding to
use for decoding the response when you access the :attr:`Response.text
` attribute. Requests will first check for an
encoding in the HTTP header, and if none is present, will use `chardet
-`_ to attempt to guess the encoding.
+`_ to attempt to guess the encoding.
The only time Requests will not do this is if no explicit charset
is present in the HTTP headers **and** the ``Content-Type``
header contains ``text``. In this situation, `RFC 2616
-`_ specifies
+`_ specifies
that the default charset must be ``ISO-8859-1``. Requests follows the
specification in this case. If you require a different encoding, you can
manually set the :attr:`Response.encoding `
@@ -839,7 +879,7 @@ Link Headers
Many HTTP APIs feature Link headers. They make APIs more self-describing and
discoverable.
-GitHub uses these for `pagination `_
+GitHub uses these for `pagination `_
in their API, for example::
>>> url = 'https://api.github.com/users/kennethreitz/repos?page=1&per_page=10'
@@ -881,7 +921,7 @@ it should apply to.
::
>>> s = requests.Session()
- >>> s.mount('http://www.github.com', MyAdapter())
+ >>> s.mount('https://github.com/', MyAdapter())
The mount call registers a specific instance of a Transport Adapter to a
prefix. Once mounted, any HTTP request made using that session whose URL starts
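The prefix matching described here can be checked without sending anything, via ``Session.get_adapter`` (a minimal sketch; the URLs are placeholders and no request is made):

```python
import requests
from requests.adapters import HTTPAdapter

s = requests.Session()
adapter = HTTPAdapter(max_retries=3)
s.mount('https://github.com/', adapter)

# get_adapter performs the same longest-matching-prefix lookup used on send
print(s.get_adapter('https://github.com/kennethreitz') is adapter)  # True
print(s.get_adapter('https://example.com/') is adapter)             # False
```

URLs that do not match the mounted prefix fall through to the default adapter registered for ``https://``.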
@@ -906,9 +946,9 @@ passed-through to `urllib3`. We'll make a Transport Adapter that instructs the
library to use SSLv3::
import ssl
+ from urllib3.poolmanager import PoolManager
from requests.adapters import HTTPAdapter
- from requests.packages.urllib3.poolmanager import PoolManager
class Ssl3HttpAdapter(HTTPAdapter):
@@ -919,7 +959,7 @@ library to use SSLv3::
num_pools=connections, maxsize=maxsize,
block=block, ssl_version=ssl.PROTOCOL_SSLv3)
-.. _`described here`: http://www.kennethreitz.org/essays/the-future-of-python-http
+.. _`described here`: https://www.kennethreitz.org/essays/the-future-of-python-http
.. _`urllib3`: https://github.com/shazow/urllib3
.. _blocking-or-nonblocking:
@@ -936,8 +976,9 @@ response at a time. However, these calls will still block.
If you are concerned about the use of blocking IO, there are lots of projects
out there that combine Requests with one of Python's asynchronicity frameworks.
-Two excellent examples are `grequests`_ and `requests-futures`_.
+Some excellent examples are `requests-threads`_, `grequests`_, and `requests-futures`_.
+.. _`requests-threads`: https://github.com/requests/requests-threads
.. _`grequests`: https://github.com/kennethreitz/grequests
.. _`requests-futures`: https://github.com/ross/requests-futures
@@ -962,7 +1003,7 @@ The **connect** timeout is the number of seconds Requests will wait for your
client to establish a connection to a remote machine (corresponding to the
`connect()`_ call on the socket). It's a good practice to set connect timeouts
to slightly larger than a multiple of 3, which is the default `TCP packet
-retransmission window `_.
+retransmission window `_.
Once your client has connected to the server and sent the HTTP request, the
**read** timeout is the number of seconds the client will wait for the server
@@ -987,4 +1028,4 @@ coffee.
r = requests.get('https://github.com', timeout=None)
-.. _`connect()`: http://linux.die.net/man/2/connect
+.. _`connect()`: https://linux.die.net/man/2/connect
diff --git a/docs/user/authentication.rst b/docs/user/authentication.rst
index 2b7a8690c1..bff809861e 100644
--- a/docs/user/authentication.rst
+++ b/docs/user/authentication.rst
@@ -3,6 +3,8 @@
Authentication
==============
+.. image:: https://farm5.staticflickr.com/4258/35550409215_3b08d49d22_k_d.jpg
+
This document discusses using various kinds of authentication with Requests.
Many web services require authentication, and there are many different types.
@@ -51,7 +53,7 @@ Another very popular form of HTTP Authentication is Digest Authentication,
and Requests supports this out of the box as well::
>>> from requests.auth import HTTPDigestAuth
- >>> url = 'http://httpbin.org/digest-auth/auth/user/pass'
+ >>> url = 'https://httpbin.org/digest-auth/auth/user/pass'
>>> requests.get(url, auth=HTTPDigestAuth('user', 'pass'))
@@ -120,7 +122,7 @@ To do so, subclass :class:`AuthBase ` and implement the
... # Implement my authentication
... return r
...
- >>> url = 'http://httpbin.org/get'
+ >>> url = 'https://httpbin.org/get'
>>> requests.get(url, auth=MyAuth())
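A fuller sketch of the same idea: the auth callable receives the ``PreparedRequest`` and must return it, so its effect can be checked offline by preparing a request (``X-Token`` is a made-up header name used only for illustration):

```python
import requests
from requests.auth import AuthBase

class TokenAuth(AuthBase):
    """Attaches a hypothetical X-Token header to the given request."""

    def __init__(self, token):
        self.token = token

    def __call__(self, r):
        # r is the PreparedRequest; modify it and hand it back
        r.headers['X-Token'] = self.token
        return r

req = requests.Request('GET', 'https://httpbin.org/get',
                       auth=TokenAuth('secret'))
prepped = req.prepare()
print(prepped.headers['X-Token'])  # secret
```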
@@ -132,13 +134,13 @@ authentication will additionally add hooks to provide further functionality.
Further examples can be found under the `Requests organization`_ and in the
``auth.py`` file.
-.. _OAuth: http://oauth.net/
+.. _OAuth: https://oauth.net/
.. _requests_oauthlib: https://github.com/requests/requests-oauthlib
-.. _requests-oauthlib OAuth2 documentation: http://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html
-.. _Web Application Flow: http://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#web-application-flow
-.. _Mobile Application Flow: http://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#mobile-application-flow
-.. _Legacy Application Flow: http://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#legacy-application-flow
-.. _Backend Application Flow: http://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#backend-application-flow
+.. _requests-oauthlib OAuth2 documentation: https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html
+.. _Web Application Flow: https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#web-application-flow
+.. _Mobile Application Flow: https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#mobile-application-flow
+.. _Legacy Application Flow: https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#legacy-application-flow
+.. _Backend Application Flow: https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow.html#backend-application-flow
.. _Kerberos: https://github.com/requests/requests-kerberos
.. _NTLM: https://github.com/requests/requests-ntlm
.. _Requests organization: https://github.com/requests
diff --git a/docs/user/install.rst b/docs/user/install.rst
index 43ed789b45..3888876ab2 100644
--- a/docs/user/install.rst
+++ b/docs/user/install.rst
@@ -3,19 +3,22 @@
Installation of Requests
========================
+.. image:: https://farm5.staticflickr.com/4230/35550376215_da1bf77a8c_k_d.jpg
+
+
This part of the documentation covers the installation of Requests.
The first step to using any software package is getting it properly installed.
-$ pip install requests
-----------------------
+$ pipenv install requests
+-------------------------
To install Requests, simply run this simple command in your terminal of choice::
- $ pip install requests
+ $ pipenv install requests
-If you don't have `pip `_ installed (tisk tisk!),
-`this Python installation guide `_
+If you don't have `pipenv `_ installed (tisk tisk!), head over to the Pipenv website for installation instructions. Or, if you prefer to just use pip and don't have it installed,
+`this Python installation guide `_
can guide you through the process.
Get the Source Code
diff --git a/docs/user/intro.rst b/docs/user/intro.rst
index 816ad0a42a..be9dfaa8b3 100644
--- a/docs/user/intro.rst
+++ b/docs/user/intro.rst
@@ -3,6 +3,8 @@
Introduction
============
+.. image:: https://farm5.staticflickr.com/4317/35198386374_1939af3de6_k_d.jpg
+
Philosophy
----------
@@ -35,8 +37,8 @@ closed-source software.
Requests is released under terms of `Apache2 License`_.
-.. _`GPL Licensed`: http://www.opensource.org/licenses/gpl-license.php
-.. _`Apache2 License`: http://opensource.org/licenses/Apache-2.0
+.. _`GPL Licensed`: https://opensource.org/licenses/gpl-license.php
+.. _`Apache2 License`: https://opensource.org/licenses/Apache-2.0
Requests License
diff --git a/docs/user/quickstart.rst b/docs/user/quickstart.rst
index 71382d5331..1a75b5ce01 100644
--- a/docs/user/quickstart.rst
+++ b/docs/user/quickstart.rst
@@ -3,6 +3,8 @@
Quickstart
==========
+.. image:: https://farm5.staticflickr.com/4259/35163667010_8bfcaef274_k_d.jpg
+
.. module:: requests.models
Eager to get started? This page gives a good introduction in how to get started
@@ -37,15 +39,15 @@ get all the information we need from this object.
Requests' simple API means that all forms of HTTP request are as obvious. For
example, this is how you make an HTTP POST request::
- >>> r = requests.post('http://httpbin.org/post', data = {'key':'value'})
+ >>> r = requests.post('https://httpbin.org/post', data = {'key':'value'})
Nice, right? What about the other HTTP request types: PUT, DELETE, HEAD and
OPTIONS? These are all just as simple::
- >>> r = requests.put('http://httpbin.org/put', data = {'key':'value'})
- >>> r = requests.delete('http://httpbin.org/delete')
- >>> r = requests.head('http://httpbin.org/get')
- >>> r = requests.options('http://httpbin.org/get')
+ >>> r = requests.put('https://httpbin.org/put', data = {'key':'value'})
+ >>> r = requests.delete('https://httpbin.org/delete')
+ >>> r = requests.head('https://httpbin.org/get')
+ >>> r = requests.options('https://httpbin.org/get')
That's all well and good, but it's also only the start of what Requests can
do.
@@ -63,12 +65,12 @@ using the ``params`` keyword argument. As an example, if you wanted to pass
following code::
>>> payload = {'key1': 'value1', 'key2': 'value2'}
- >>> r = requests.get('http://httpbin.org/get', params=payload)
+ >>> r = requests.get('https://httpbin.org/get', params=payload)
You can see that the URL has been correctly encoded by printing the URL::
>>> print(r.url)
- http://httpbin.org/get?key2=value2&key1=value1
+ https://httpbin.org/get?key2=value2&key1=value1
Note that any dictionary key whose value is ``None`` will not be added to the
URL's query string.
@@ -77,9 +79,9 @@ You can also pass a list of items as a value::
>>> payload = {'key1': 'value1', 'key2': ['value2', 'value3']}
- >>> r = requests.get('http://httpbin.org/get', params=payload)
+ >>> r = requests.get('https://httpbin.org/get', params=payload)
>>> print(r.url)
- http://httpbin.org/get?key1=value1&key2=value2&key2=value3
+ https://httpbin.org/get?key1=value1&key2=value2&key2=value3
Response Content
----------------
@@ -108,7 +110,7 @@ using, and change it, using the ``r.encoding`` property::
If you change the encoding, Requests will use the new value of ``r.encoding``
whenever you call ``r.text``. You might want to do this in any situation where
you can apply special logic to work out what the encoding of the content will
-be. For example, HTTP and XML have the ability to specify their encoding in
+be. For example, HTML and XML have the ability to specify their encoding in
their body. In situations like this, you should use ``r.content`` to find the
encoding, and then set ``r.encoding``. This will let you use ``r.text`` with
the correct encoding.
@@ -169,7 +171,7 @@ server, you can access ``r.raw``. If you want to do this, make sure you set
>>> r = requests.get('https://api.github.com/events', stream=True)
>>> r.raw
-<requests.packages.urllib3.response.HTTPResponse object at 0x101194810>
+<urllib3.response.HTTPResponse object at 0x101194810>
>>> r.raw.read(10)
'\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03'
@@ -187,6 +189,14 @@ download, the above is the preferred and recommended way to retrieve the
content. Note that ``chunk_size`` can be freely adjusted to a number that
may better fit your use cases.
+.. note::
+
+ An important note about using ``Response.iter_content`` versus ``Response.raw``.
+ ``Response.iter_content`` will automatically decode the ``gzip`` and ``deflate``
+ transfer-encodings. ``Response.raw`` is a raw stream of bytes -- it does not
+ transform the response content. If you really need access to the bytes as they
+ were returned, use ``Response.raw``.
+
Custom Headers
--------------
@@ -223,7 +233,7 @@ dictionary of data will automatically be form-encoded when the request is made::
>>> payload = {'key1': 'value1', 'key2': 'value2'}
- >>> r = requests.post("http://httpbin.org/post", data=payload)
+ >>> r = requests.post("https://httpbin.org/post", data=payload)
>>> print(r.text)
{
...
@@ -234,12 +244,16 @@ dictionary of data will automatically be form-encoded when the request is made::
...
}
-You can also pass a list of tuples to the ``data`` argument. This is particularly
-useful when the form has multiple elements that use the same key::
+The ``data`` argument can also have multiple values for each key. This can be
+done by making ``data`` either a list of tuples or a dictionary with lists
+as values. This is particularly useful when the form has multiple elements that
+use the same key::
- >>> payload = (('key1', 'value1'), ('key1', 'value2'))
- >>> r = requests.post('http://httpbin.org/post', data=payload)
- >>> print(r.text)
+ >>> payload_tuples = [('key1', 'value1'), ('key1', 'value2')]
+ >>> r1 = requests.post('https://httpbin.org/post', data=payload_tuples)
+ >>> payload_dict = {'key1': ['value1', 'value2']}
+ >>> r2 = requests.post('https://httpbin.org/post', data=payload_dict)
+ >>> print(r1.text)
{
...
"form": {
@@ -250,6 +264,8 @@ useful when the form has multiple elements that use the same key::
},
...
}
+ >>> r1.text == r2.text
+ True
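The equivalence of the two forms can also be verified offline by inspecting the encoded bodies of prepared requests (a sketch; nothing is sent to httpbin):

```python
import requests

payload_tuples = [('key1', 'value1'), ('key1', 'value2')]
payload_dict = {'key1': ['value1', 'value2']}

# .prepare() encodes the form body without sending the request
body1 = requests.Request('POST', 'https://httpbin.org/post',
                         data=payload_tuples).prepare().body
body2 = requests.Request('POST', 'https://httpbin.org/post',
                         data=payload_dict).prepare().body

print(body1)           # key1=value1&key1=value2
print(body1 == body2)  # True
```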
There are times that you may want to send data that is not form-encoded. If
you pass in a ``string`` instead of a ``dict``, that data will be posted directly.
@@ -271,13 +287,16 @@ the ``json`` parameter (added in version 2.4.2) and it will be encoded automatic
>>> r = requests.post(url, json=payload)
+Note that the ``json`` parameter is ignored if either ``data`` or ``files`` is passed.
+
+Using the ``json`` parameter in the request will change the ``Content-Type`` header to ``application/json``.
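Both effects can be checked offline by preparing the request instead of sending it (a sketch; the httpbin URL is only a placeholder):

```python
import json
import requests

req = requests.Request('POST', 'https://httpbin.org/post',
                       json={'key': 'value'})
prepped = req.prepare()

# The json parameter serializes the payload and sets the Content-Type
print(prepped.headers['Content-Type'])  # application/json
print(json.loads(prepped.body))
```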
POST a Multipart-Encoded File
-----------------------------
Requests makes it simple to upload Multipart-encoded files::
- >>> url = 'http://httpbin.org/post'
+ >>> url = 'https://httpbin.org/post'
>>> files = {'file': open('report.xls', 'rb')}
>>> r = requests.post(url, files=files)
@@ -292,7 +311,7 @@ Requests makes it simple to upload Multipart-encoded files::
You can set the filename, content_type and headers explicitly::
- >>> url = 'http://httpbin.org/post'
+ >>> url = 'https://httpbin.org/post'
>>> files = {'file': ('report.xls', open('report.xls', 'rb'), 'application/vnd.ms-excel', {'Expires': '0'})}
>>> r = requests.post(url, files=files)
@@ -307,7 +326,7 @@ You can set the filename, content_type and headers explicitly::
If you want, you can send strings to be received as files::
- >>> url = 'http://httpbin.org/post'
+ >>> url = 'https://httpbin.org/post'
>>> files = {'file': ('report.csv', 'some,data,to,send\nanother,row,to,send\n')}
>>> r = requests.post(url, files=files)
@@ -329,13 +348,11 @@ support this, but there is a separate package which does -
For sending multiple files in one request refer to the :ref:`advanced `
section.
-.. warning:: It is strongly recommended that you open files in `binary mode`_.
- This is because Requests may attempt to provide the
- ``Content-Length`` header for you, and if it does this value will
- be set to the number of *bytes* in the file. Errors may occur if
- you open the file in *text mode*.
-
-.. _binary mode: https://docs.python.org/2/tutorial/inputoutput.html#reading-and-writing-files
+.. warning:: It is strongly recommended that you open files in :ref:`binary
+ mode <tut-files>`. This is because Requests may attempt to provide
+ the ``Content-Length`` header for you, and if it does this value
+ will be set to the number of *bytes* in the file. Errors may occur
+ if you open the file in *text mode*.
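The mismatch the warning describes is easy to demonstrate with the standard library alone (the file name here is hypothetical): a character count from text mode is not the byte count ``Content-Length`` needs.

```python
import os
import tempfile

# A file whose text length and byte length differ: 'é' is two bytes in UTF-8.
path = os.path.join(tempfile.mkdtemp(), 'demo.txt')
with open(path, 'w', encoding='utf-8', newline='') as f:
    f.write('café\n')

with open(path, 'r', encoding='utf-8', newline='') as f:
    text_len = len(f.read())   # 5 characters
with open(path, 'rb') as f:
    byte_len = len(f.read())   # 6 bytes -- the count Content-Length needs

print(text_len, byte_len)
```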
Response Status Codes
@@ -343,7 +360,7 @@ Response Status Codes
We can check the response status code::
- >>> r = requests.get('http://httpbin.org/get')
+ >>> r = requests.get('https://httpbin.org/get')
>>> r.status_code
200
@@ -357,7 +374,7 @@ If we made a bad request (a 4XX client error or 5XX server error response), we
can raise it with
:meth:`Response.raise_for_status() <requests.Response.raise_for_status>`::
- >>> bad_r = requests.get('http://httpbin.org/status/404')
+ >>> bad_r = requests.get('https://httpbin.org/status/404')
>>> bad_r.status_code
404
@@ -393,7 +410,7 @@ We can view the server's response headers using a Python dictionary::
}
The dictionary is special, though: it's made just for HTTP headers. According to
-`RFC 7230 <http://tools.ietf.org/html/rfc7230#section-3.2>`_, HTTP Header names
+`RFC 7230 <https://tools.ietf.org/html/rfc7230#section-3.2>`_, HTTP Header names
are case-insensitive.
So, we can access the headers using any capitalization we want::
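The behaviour comes from ``requests.structures.CaseInsensitiveDict``, which can also be exercised directly:

```python
from requests.structures import CaseInsensitiveDict

headers = CaseInsensitiveDict({'Content-Type': 'application/json'})
# Lookups ignore case, as RFC 7230 requires for header field names.
print(headers['content-type'])
print(headers['CONTENT-TYPE'])
print('ConTent-TyPe' in headers)
```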
@@ -407,7 +424,7 @@ So, we can access the headers using any capitalization we want::
It is also special in that the server could have sent the same header multiple
times with different values, but requests combines them so they can be
represented in the dictionary within a single mapping, as per
-`RFC 7230 <http://tools.ietf.org/html/rfc7230#section-3.2.2>`_:
+`RFC 7230 <https://tools.ietf.org/html/rfc7230#section-3.2.2>`_:
A recipient MAY combine multiple header fields with the same field name
into one "field-name: field-value" pair, without changing the semantics
@@ -428,7 +445,7 @@ If a response contains some Cookies, you can quickly access them::
To send your own cookies to the server, you can use the ``cookies``
parameter::
- >>> url = 'http://httpbin.org/cookies'
+ >>> url = 'https://httpbin.org/cookies'
>>> cookies = dict(cookies_are='working')
>>> r = requests.get(url, cookies=cookies)
@@ -443,7 +460,7 @@ also be passed in to requests::
>>> jar = requests.cookies.RequestsCookieJar()
>>> jar.set('tasty_cookie', 'yum', domain='httpbin.org', path='/cookies')
>>> jar.set('gross_cookie', 'blech', domain='httpbin.org', path='/elsewhere')
- >>> url = 'http://httpbin.org/cookies'
+ >>> url = 'https://httpbin.org/cookies'
>>> r = requests.get(url, cookies=jar)
>>> r.text
'{"cookies": {"tasty_cookie": "yum"}}'
@@ -464,7 +481,7 @@ response.
For example, GitHub redirects all HTTP requests to HTTPS::
- >>> r = requests.get('http://github.com')
+ >>> r = requests.get('https://github.com/')
>>> r.url
'https://github.com/'
@@ -479,7 +496,7 @@ For example, GitHub redirects all HTTP requests to HTTPS::
If you're using GET, OPTIONS, POST, PUT, PATCH or DELETE, you can disable
redirection handling with the ``allow_redirects`` parameter::
- >>> r = requests.get('http://github.com', allow_redirects=False)
+ >>> r = requests.get('https://github.com/', allow_redirects=False)
>>> r.status_code
301
@@ -489,7 +506,7 @@ redirection handling with the ``allow_redirects`` parameter::
If you're using HEAD, you can enable redirection as well::
- >>> r = requests.head('http://github.com', allow_redirects=True)
+ >>> r = requests.head('https://github.com/', allow_redirects=True)
>>> r.url
'https://github.com/'
@@ -506,7 +523,7 @@ seconds with the ``timeout`` parameter. Nearly all production code should use
this parameter in nearly all requests. Failure to do so can cause your program
to hang indefinitely::
- >>> requests.get('http://github.com', timeout=0.001)
+ >>> requests.get('https://github.com/', timeout=0.001)
Traceback (most recent call last):
File "", line 1, in
requests.exceptions.Timeout: HTTPConnectionPool(host='github.com', port=80): Request timed out. (timeout=0.001)
diff --git a/requests/__init__.py b/requests/__init__.py
index 268e7dcc53..bc168ee533 100644
--- a/requests/__init__.py
+++ b/requests/__init__.py
@@ -22,7 +22,7 @@
... or POST:
>>> payload = dict(key1='value1', key2='value2')
- >>> r = requests.post('http://httpbin.org/post', data=payload)
+ >>> r = requests.post('https://httpbin.org/post', data=payload)
>>> print(r.text)
{
...
@@ -57,10 +57,10 @@ def check_compatibility(urllib3_version, chardet_version):
# Check urllib3 for compatibility.
major, minor, patch = urllib3_version # noqa: F811
major, minor, patch = int(major), int(minor), int(patch)
- # urllib3 >= 1.21.1, <= 1.22
+ # urllib3 >= 1.21.1, <= 1.24
assert major == 1
assert minor >= 21
- assert minor <= 22
+ assert minor <= 24
# Check chardet for compatibility.
major, minor, patch = chardet_version.split('.')[:3]
@@ -71,11 +71,22 @@ def check_compatibility(urllib3_version, chardet_version):
assert patch >= 2
+def _check_cryptography(cryptography_version):
+ # cryptography < 1.3.4
+ try:
+ cryptography_version = list(map(int, cryptography_version.split('.')))
+ except ValueError:
+ return
+
+ if cryptography_version < [1, 3, 4]:
+ warning = 'Old version of cryptography ({}) may cause slowdown.'.format(cryptography_version)
+ warnings.warn(warning, RequestsDependencyWarning)
+
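The check above boils down to comparing lists of integers; a standalone sketch of the same parsing rule (the helper name is illustrative):

```python
def parse_version(version):
    # Returns None for versions that aren't purely numeric, mirroring the
    # early return in _check_cryptography above.
    try:
        return list(map(int, version.split('.')))
    except ValueError:
        return None

print(parse_version('1.3.2') < [1, 3, 4])   # True: would trigger the warning
print(parse_version('2.1.0') < [1, 3, 4])   # False: new enough
print(parse_version('1.3.4.dev1'))          # None: non-numeric, check skipped
```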
# Check imported dependencies for compatibility.
try:
check_compatibility(urllib3.__version__, chardet.__version__)
except (AssertionError, ValueError):
- warnings.warn("urllib3 ({0}) or chardet ({1}) doesn't match a supported "
+ warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
"version!".format(urllib3.__version__, chardet.__version__),
RequestsDependencyWarning)
@@ -83,6 +94,10 @@ def check_compatibility(urllib3_version, chardet_version):
try:
from urllib3.contrib import pyopenssl
pyopenssl.inject_into_urllib3()
+
+ # Check cryptography version
+ from cryptography import __version__ as cryptography_version
+ _check_cryptography(cryptography_version)
except ImportError:
pass
@@ -108,12 +123,7 @@ def check_compatibility(urllib3_version, chardet_version):
# Set default logging handler to avoid "No handler found" warnings.
import logging
-try: # Python 2.7+
- from logging import NullHandler
-except ImportError:
- class NullHandler(logging.Handler):
- def emit(self, record):
- pass
+from logging import NullHandler
logging.getLogger(__name__).addHandler(NullHandler())
diff --git a/requests/__version__.py b/requests/__version__.py
index dc33eef651..be8a45fe0e 100644
--- a/requests/__version__.py
+++ b/requests/__version__.py
@@ -5,10 +5,10 @@
__title__ = 'requests'
__description__ = 'Python HTTP for Humans.'
__url__ = 'http://python-requests.org'
-__version__ = '2.18.4'
-__build__ = 0x021804
+__version__ = '2.20.0'
+__build__ = 0x022000
__author__ = 'Kenneth Reitz'
__author_email__ = 'me@kennethreitz.org'
__license__ = 'Apache 2.0'
-__copyright__ = 'Copyright 2017 Kenneth Reitz'
+__copyright__ = 'Copyright 2018 Kenneth Reitz'
__cake__ = u'\u2728 \U0001f370 \u2728'
diff --git a/requests/adapters.py b/requests/adapters.py
index 00f8792b69..fa4d9b3cc9 100644
--- a/requests/adapters.py
+++ b/requests/adapters.py
@@ -13,6 +13,7 @@
from urllib3.poolmanager import PoolManager, proxy_from_url
from urllib3.response import HTTPResponse
+from urllib3.util import parse_url
from urllib3.util import Timeout as TimeoutSauce
from urllib3.util.retry import Retry
from urllib3.exceptions import ClosedPoolError
@@ -25,16 +26,18 @@
from urllib3.exceptions import ReadTimeoutError
from urllib3.exceptions import SSLError as _SSLError
from urllib3.exceptions import ResponseError
+from urllib3.exceptions import LocationValueError
from .models import Response
from .compat import urlparse, basestring
-from .utils import (DEFAULT_CA_BUNDLE_PATH, get_encoding_from_headers,
- prepend_scheme_if_needed, get_auth_from_url, urldefragauth,
- select_proxy)
+from .utils import (DEFAULT_CA_BUNDLE_PATH, extract_zipped_paths,
+ get_encoding_from_headers, prepend_scheme_if_needed,
+ get_auth_from_url, urldefragauth, select_proxy)
from .structures import CaseInsensitiveDict
from .cookies import extract_cookies_to_jar
from .exceptions import (ConnectionError, ConnectTimeout, ReadTimeout, SSLError,
- ProxyError, RetryError, InvalidSchema)
+ ProxyError, RetryError, InvalidSchema, InvalidProxyURL,
+ InvalidURL)
from .auth import _basic_auth_str
try:
@@ -126,8 +129,7 @@ def __init__(self, pool_connections=DEFAULT_POOLSIZE,
self.init_poolmanager(pool_connections, pool_maxsize, block=pool_block)
def __getstate__(self):
- return dict((attr, getattr(self, attr, None)) for attr in
- self.__attrs__)
+ return {attr: getattr(self, attr, None) for attr in self.__attrs__}
def __setstate__(self, state):
# Can't handle by adding 'proxy_manager' to self.__attrs__ because
@@ -219,11 +221,11 @@ def cert_verify(self, conn, url, verify, cert):
cert_loc = verify
if not cert_loc:
- cert_loc = DEFAULT_CA_BUNDLE_PATH
+ cert_loc = extract_zipped_paths(DEFAULT_CA_BUNDLE_PATH)
if not cert_loc or not os.path.exists(cert_loc):
raise IOError("Could not find a suitable TLS CA certificate bundle, "
- "invalid path: {0}".format(cert_loc))
+ "invalid path: {}".format(cert_loc))
conn.cert_reqs = 'CERT_REQUIRED'
@@ -245,10 +247,10 @@ def cert_verify(self, conn, url, verify, cert):
conn.key_file = None
if conn.cert_file and not os.path.exists(conn.cert_file):
raise IOError("Could not find the TLS certificate file, "
- "invalid path: {0}".format(conn.cert_file))
+ "invalid path: {}".format(conn.cert_file))
if conn.key_file and not os.path.exists(conn.key_file):
raise IOError("Could not find the TLS key file, "
- "invalid path: {0}".format(conn.key_file))
+ "invalid path: {}".format(conn.key_file))
def build_response(self, req, resp):
"""Builds a :class:`Response ` object from a urllib3
@@ -300,6 +302,10 @@ def get_connection(self, url, proxies=None):
if proxy:
proxy = prepend_scheme_if_needed(proxy, 'http')
+ proxy_url = parse_url(proxy)
+ if not proxy_url.host:
+ raise InvalidProxyURL("Please check proxy URL. It is malformed"
+ " and could be missing the host.")
proxy_manager = self.proxy_manager_for(proxy)
conn = proxy_manager.connection_from_url(url)
else:
@@ -373,7 +379,7 @@ def proxy_headers(self, proxy):
when subclassing the
:class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.
- :param proxies: The url of the proxy being used for this request.
+ :param proxy: The url of the proxy being used for this request.
:rtype: dict
"""
headers = {}
@@ -402,11 +408,14 @@ def send(self, request, stream=False, timeout=None, verify=True, cert=None, prox
:rtype: requests.Response
"""
- conn = self.get_connection(request.url, proxies)
+ try:
+ conn = self.get_connection(request.url, proxies)
+ except LocationValueError as e:
+ raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
- self.add_headers(request)
+ self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
@@ -416,7 +425,7 @@ def send(self, request, stream=False, timeout=None, verify=True, cert=None, prox
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
- err = ("Invalid timeout {0}. Pass a (connect, read) "
+ err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
@@ -466,11 +475,10 @@ def send(self, request, stream=False, timeout=None, verify=True, cert=None, prox
# Receive the response from the server
try:
- # For Python 2.7+ versions, use buffering of HTTP
- # responses
+ # For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
- # For compatibility with Python 2.6 versions and back
+ # For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
diff --git a/requests/api.py b/requests/api.py
index bc2115c150..abada96d46 100644
--- a/requests/api.py
+++ b/requests/api.py
@@ -18,9 +18,11 @@ def request(method, url, **kwargs):
:param method: method for the new :class:`Request` object.
:param url: URL for the new :class:`Request` object.
- :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
- :param data: (optional) Dictionary or list of tuples ``[(key, value)]`` (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
- :param json: (optional) json data to send in the body of the :class:`Request`.
+ :param params: (optional) Dictionary, list of tuples or bytes to send
+ in the query string for the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
+ :param json: (optional) A JSON serializable Python object to send in the body of the :class:`Request`.
:param headers: (optional) Dictionary of HTTP Headers to send with the :class:`Request`.
:param cookies: (optional) Dict or CookieJar object to send with the :class:`Request`.
:param files: (optional) Dictionary of ``'name': file-like-objects`` (or ``{'name': file-tuple}``) for multipart encoding upload.
@@ -47,7 +49,7 @@ def request(method, url, **kwargs):
Usage::
>>> import requests
- >>> req = requests.request('GET', 'http://httpbin.org/get')
+ >>> req = requests.request('GET', 'https://httpbin.org/get')
"""
@@ -62,7 +64,8 @@ def get(url, params=None, **kwargs):
r"""Sends a GET request.
:param url: URL for the new :class:`Request` object.
- :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
+ :param params: (optional) Dictionary, list of tuples or bytes to send
+ in the query string for the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:return: :class:`Response <Response>` object
:rtype: requests.Response
@@ -102,7 +105,8 @@ def post(url, data=None, json=None, **kwargs):
r"""Sends a POST request.
:param url: URL for the new :class:`Request` object.
- :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
:param json: (optional) json data to send in the body of the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:return: :class:`Response <Response>` object
@@ -116,7 +120,8 @@ def put(url, data=None, **kwargs):
r"""Sends a PUT request.
:param url: URL for the new :class:`Request` object.
- :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
:param json: (optional) json data to send in the body of the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:return: :class:`Response <Response>` object
@@ -130,7 +135,8 @@ def patch(url, data=None, **kwargs):
r"""Sends a PATCH request.
:param url: URL for the new :class:`Request` object.
- :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
:param json: (optional) json data to send in the body of the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:return: :class:`Response <Response>` object
diff --git a/requests/auth.py b/requests/auth.py
index 1a182dffdd..bdde51c7fd 100644
--- a/requests/auth.py
+++ b/requests/auth.py
@@ -38,7 +38,7 @@ def _basic_auth_str(username, password):
if not isinstance(username, basestring):
warnings.warn(
"Non-string usernames will no longer be supported in Requests "
- "3.0.0. Please convert the object you've passed in ({0!r}) to "
+ "3.0.0. Please convert the object you've passed in ({!r}) to "
"a string or bytes object in the near future to avoid "
"problems.".format(username),
category=DeprecationWarning,
@@ -48,7 +48,7 @@ def _basic_auth_str(username, password):
if not isinstance(password, basestring):
warnings.warn(
"Non-string passwords will no longer be supported in Requests "
- "3.0.0. Please convert the object you've passed in ({0!r}) to "
+ "3.0.0. Please convert the object you've passed in ({!r}) to "
"a string or bytes object in the near future to avoid "
"problems.".format(password),
category=DeprecationWarning,
@@ -153,6 +153,18 @@ def sha_utf8(x):
x = x.encode('utf-8')
return hashlib.sha1(x).hexdigest()
hash_utf8 = sha_utf8
+ elif _algorithm == 'SHA-256':
+ def sha256_utf8(x):
+ if isinstance(x, str):
+ x = x.encode('utf-8')
+ return hashlib.sha256(x).hexdigest()
+ hash_utf8 = sha256_utf8
+ elif _algorithm == 'SHA-512':
+ def sha512_utf8(x):
+ if isinstance(x, str):
+ x = x.encode('utf-8')
+ return hashlib.sha512(x).hexdigest()
+ hash_utf8 = sha512_utf8
KD = lambda s, d: hash_utf8("%s:%s" % (s, d))
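The new hash helpers all follow the same encode-then-hexdigest pattern; a standalone sketch of the SHA-256 branch, together with the ``KD`` keyed-digest helper it feeds:

```python
import hashlib

def sha256_utf8(x):
    # Digest auth hashes operate on bytes, so str input is UTF-8 encoded first.
    if isinstance(x, str):
        x = x.encode('utf-8')
    return hashlib.sha256(x).hexdigest()

# KD hashes "secret:data" pairs, as in the line above.
KD = lambda s, d: sha256_utf8("%s:%s" % (s, d))

print(sha256_utf8('user:realm:password'))
print(KD('secret', 'data'))
```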
diff --git a/requests/compat.py b/requests/compat.py
index f417cfd8d0..c44b35efb9 100644
--- a/requests/compat.py
+++ b/requests/compat.py
@@ -43,8 +43,8 @@
import cookielib
from Cookie import Morsel
from StringIO import StringIO
+ from collections import Callable, Mapping, MutableMapping, OrderedDict
- from urllib3.packages.ordered_dict import OrderedDict
builtin_str = str
bytes = str
@@ -60,6 +60,7 @@
from http.cookies import Morsel
from io import StringIO
from collections import OrderedDict
+ from collections.abc import Callable, Mapping, MutableMapping
builtin_str = str
str = str
diff --git a/requests/cookies.py b/requests/cookies.py
index ab3c88b9bf..56fccd9c25 100644
--- a/requests/cookies.py
+++ b/requests/cookies.py
@@ -12,10 +12,9 @@
import copy
import time
import calendar
-import collections
from ._internal_utils import to_native_string
-from .compat import cookielib, urlparse, urlunparse, Morsel
+from .compat import cookielib, urlparse, urlunparse, Morsel, MutableMapping
try:
import threading
@@ -169,7 +168,7 @@ class CookieConflictError(RuntimeError):
"""
-class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
+class RequestsCookieJar(cookielib.CookieJar, MutableMapping):
"""Compatibility class; is a cookielib.CookieJar, but exposes a dict
interface.
@@ -415,9 +414,14 @@ def __setstate__(self, state):
def copy(self):
"""Return a copy of this RequestsCookieJar."""
new_cj = RequestsCookieJar()
+ new_cj.set_policy(self.get_policy())
new_cj.update(self)
return new_cj
+ def get_policy(self):
+ """Return the CookiePolicy instance used."""
+ return self._policy
+
def _copy_cookie_jar(jar):
if jar is None:
@@ -440,20 +444,21 @@ def create_cookie(name, value, **kwargs):
By default, the pair of `name` and `value` will be set for the domain ''
and sent on every request (this is sometimes called a "supercookie").
"""
- result = dict(
- version=0,
- name=name,
- value=value,
- port=None,
- domain='',
- path='/',
- secure=False,
- expires=None,
- discard=True,
- comment=None,
- comment_url=None,
- rest={'HttpOnly': None},
- rfc2109=False,)
+ result = {
+ 'version': 0,
+ 'name': name,
+ 'value': value,
+ 'port': None,
+ 'domain': '',
+ 'path': '/',
+ 'secure': False,
+ 'expires': None,
+ 'discard': True,
+ 'comment': None,
+ 'comment_url': None,
+ 'rest': {'HttpOnly': None},
+ 'rfc2109': False,
+ }
badargs = set(kwargs) - set(result)
if badargs:
@@ -507,6 +512,7 @@ def cookiejar_from_dict(cookie_dict, cookiejar=None, overwrite=True):
:param cookiejar: (optional) A cookiejar to add the cookies to.
:param overwrite: (optional) If False, will not replace cookies
already in the jar with new ones.
+ :rtype: CookieJar
"""
if cookiejar is None:
cookiejar = RequestsCookieJar()
@@ -525,6 +531,7 @@ def merge_cookies(cookiejar, cookies):
:param cookiejar: CookieJar object to add the cookies to.
:param cookies: Dictionary or CookieJar object to be added.
+ :rtype: CookieJar
"""
if not isinstance(cookiejar, cookielib.CookieJar):
raise ValueError('You can only merge into CookieJar')
diff --git a/requests/exceptions.py b/requests/exceptions.py
index be7eaed6b9..a80cad80f1 100644
--- a/requests/exceptions.py
+++ b/requests/exceptions.py
@@ -85,6 +85,10 @@ class InvalidHeader(RequestException, ValueError):
"""The header value provided was somehow invalid."""
+class InvalidProxyURL(InvalidURL):
+ """The proxy URL provided is invalid."""
+
+
class ChunkedEncodingError(RequestException):
"""The server declared chunked encoding but sent an invalid chunk."""
diff --git a/requests/help.py b/requests/help.py
index 5440ee614e..e53d35ef6d 100644
--- a/requests/help.py
+++ b/requests/help.py
@@ -13,7 +13,7 @@
from . import __version__ as requests_version
try:
- from .packages.urllib3.contrib import pyopenssl
+ from urllib3.contrib import pyopenssl
except ImportError:
pyopenssl = None
OpenSSL = None
@@ -89,8 +89,7 @@ def info():
'version': getattr(idna, '__version__', ''),
}
- # OPENSSL_VERSION_NUMBER doesn't exist in the Python 2.6 ssl module.
- system_ssl = getattr(ssl, 'OPENSSL_VERSION_NUMBER', None)
+ system_ssl = ssl.OPENSSL_VERSION_NUMBER
system_ssl_info = {
'version': '%x' % system_ssl if system_ssl is not None else ''
}
diff --git a/requests/hooks.py b/requests/hooks.py
index 32b32de750..7a51f212c8 100644
--- a/requests/hooks.py
+++ b/requests/hooks.py
@@ -15,14 +15,14 @@
def default_hooks():
- return dict((event, []) for event in HOOKS)
+ return {event: [] for event in HOOKS}
# TODO: response is the only one
def dispatch_hook(key, hooks, hook_data, **kwargs):
"""Dispatches a hook dictionary on a given piece of data."""
- hooks = hooks or dict()
+ hooks = hooks or {}
hooks = hooks.get(key)
if hooks:
if hasattr(hooks, '__call__'):
diff --git a/requests/models.py b/requests/models.py
index 4041cac3f0..3dded57eff 100644
--- a/requests/models.py
+++ b/requests/models.py
@@ -7,7 +7,6 @@
This module contains the primary objects that power Requests.
"""
-import collections
import datetime
import sys
@@ -37,6 +36,7 @@
stream_decode_response_unicode, to_key_val_list, parse_header_links,
iter_slices, guess_json_utf, super_len, check_header_validity)
from .compat import (
+ Callable, Mapping,
cookielib, urlunparse, urlsplit, urlencode, str, bytes,
is_py2, chardet, builtin_str, basestring)
from .compat import json as complexjson
@@ -155,8 +155,12 @@ def _encode_files(files, data):
if isinstance(fp, (str, bytes, bytearray)):
fdata = fp
- else:
+ elif hasattr(fp, 'read'):
fdata = fp.read()
+ elif fp is None:
+ continue
+ else:
+ fdata = fp
rf = RequestField(name=k, data=fdata, filename=fn, headers=fh)
rf.make_multipart(content_type=ft)
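The branch order added above can be read as a small normalization routine for one multipart field (a hypothetical standalone restatement, not the actual `_encode_files` signature):

```python
import io

def normalize_file_payload(fp):
    """Mirror the branch order added above for a single multipart field."""
    if isinstance(fp, (str, bytes, bytearray)):
        return fp                 # raw payloads pass through unchanged
    elif hasattr(fp, 'read'):
        return fp.read()          # file-like objects are read eagerly
    elif fp is None:
        return None               # the real code skips the field (continue)
    else:
        return fp                 # anything else passes through as-is

print(normalize_file_payload(io.BytesIO(b'some bytes')))
print(normalize_file_payload('a plain string'))
print(normalize_file_payload(None))
```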
@@ -174,10 +178,10 @@ def register_hook(self, event, hook):
if event not in self.hooks:
raise ValueError('Unsupported event specified, with event name "%s"' % (event))
- if isinstance(hook, collections.Callable):
+ if isinstance(hook, Callable):
self.hooks[event].append(hook)
elif hasattr(hook, '__iter__'):
- self.hooks[event].extend(h for h in hook if isinstance(h, collections.Callable))
+ self.hooks[event].extend(h for h in hook if isinstance(h, Callable))
def deregister_hook(self, event, hook):
"""Deregister a previously registered hook.
@@ -200,9 +204,13 @@ class Request(RequestHooksMixin):
:param url: URL to send.
:param headers: dictionary of headers to send.
:param files: dictionary of {filename: fileobject} files to multipart upload.
- :param data: the body to attach to the request. If a dictionary is provided, form-encoding will take place.
+ :param data: the body to attach to the request. If a dictionary or
+ list of tuples ``[(key, value)]`` is provided, form-encoding will
+ take place.
:param json: json for the body to attach to the request (if files or data is not specified).
- :param params: dictionary of URL parameters to append to the URL.
+ :param params: URL parameters to append to the URL. If a dictionary or
+ list of tuples ``[(key, value)]`` is provided, form-encoding will
+ take place.
:param auth: Auth handler or (user, pass) tuple.
:param cookies: dictionary or CookieJar of cookies to attach to this request.
:param hooks: dictionary of callback hooks, for internal usage.
@@ -210,7 +218,7 @@ class Request(RequestHooksMixin):
Usage::
>>> import requests
- >>> req = requests.Request('GET', 'http://httpbin.org/get')
+ >>> req = requests.Request('GET', 'https://httpbin.org/get')
>>> req.prepare()
<PreparedRequest [GET]>
"""
@@ -270,7 +278,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
Usage::
>>> import requests
- >>> req = requests.Request('GET', 'http://httpbin.org/get')
+ >>> req = requests.Request('GET', 'https://httpbin.org/get')
>>> r = req.prepare()
@@ -461,7 +469,7 @@ def prepare_body(self, data, files, json=None):
is_stream = all([
hasattr(data, '__iter__'),
- not isinstance(data, (basestring, list, tuple, collections.Mapping))
+ not isinstance(data, (basestring, list, tuple, Mapping))
])
try:
@@ -644,10 +652,7 @@ def __getstate__(self):
if not self._content_consumed:
self.content
- return dict(
- (attr, getattr(self, attr, None))
- for attr in self.__attrs__
- )
+ return {attr: getattr(self, attr, None) for attr in self.__attrs__}
def __setstate__(self, state):
for name, value in state.items():
@@ -686,11 +691,11 @@ def __iter__(self):
@property
def ok(self):
- """Returns True if :attr:`status_code` is less than 400.
+ """Returns True if :attr:`status_code` is less than 400, False if not.
This attribute checks if the status code of the response is between
400 and 600 to see if there was a client error or a server error. If
- the status code, is between 200 and 400, this will return True. This
+ the status code is between 200 and 400, this will return True. This
is **not** a check to see if the response code is ``200 OK``.
"""
try:
@@ -820,7 +825,7 @@ def content(self):
if self.status_code == 0 or self.raw is None:
self._content = None
else:
- self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
+ self._content = b''.join(self.iter_content(CONTENT_CHUNK_SIZE)) or b''
self._content_consumed = True
# don't need to release the connection; that's been handled by urllib3
diff --git a/requests/sessions.py b/requests/sessions.py
index 6570e73349..a448bd83f2 100644
--- a/requests/sessions.py
+++ b/requests/sessions.py
@@ -8,13 +8,12 @@
requests (cookies, auth, proxies).
"""
import os
-import platform
+import sys
import time
-from collections import Mapping
from datetime import timedelta
from .auth import _basic_auth_str
-from .compat import cookielib, is_py3, OrderedDict, urljoin, urlparse
+from .compat import cookielib, is_py3, OrderedDict, urljoin, urlparse, Mapping
from .cookies import (
cookiejar_from_dict, extract_cookies_to_jar, RequestsCookieJar, merge_cookies)
from .models import Request, PreparedRequest, DEFAULT_REDIRECT_LIMIT
@@ -38,8 +37,8 @@
from .models import REDIRECT_STATI
# Preferred clock, based on which one is more accurate on a given system.
-if platform.system() == 'Windows':
- try: # Python 3.3+
+if sys.platform == 'win32':
+ try: # Python 3.4+
preferred_clock = time.perf_counter
except AttributeError: # Earlier than Python 3.
preferred_clock = time.clock
@@ -116,6 +115,22 @@ def get_redirect_target(self, resp):
return to_native_string(location, 'utf8')
return None
+ def should_strip_auth(self, old_url, new_url):
+ """Decide whether Authorization header should be removed when redirecting"""
+ old_parsed = urlparse(old_url)
+ new_parsed = urlparse(new_url)
+ if old_parsed.hostname != new_parsed.hostname:
+ return True
+ # Special case: allow http -> https redirect when using the standard
+ # ports. This isn't specified by RFC 7235, but is kept to avoid
+ # breaking backwards compatibility with older versions of requests
+ # that allowed any redirects on the same host.
+ if (old_parsed.scheme == 'http' and old_parsed.port in (80, None)
+ and new_parsed.scheme == 'https' and new_parsed.port in (443, None)):
+ return False
+ # Standard case: root URI must match
+ return old_parsed.port != new_parsed.port or old_parsed.scheme != new_parsed.scheme
+
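The rule can be restated as a standalone function, runnable outside a session (same logic, stdlib ``urlparse``, example hosts are hypothetical):

```python
from urllib.parse import urlparse

def should_strip_auth(old_url, new_url):
    # Strip the Authorization header on any host change...
    old, new = urlparse(old_url), urlparse(new_url)
    if old.hostname != new.hostname:
        return True
    # ...but keep it for an http -> https upgrade on the default ports...
    if (old.scheme == 'http' and old.port in (80, None)
            and new.scheme == 'https' and new.port in (443, None)):
        return False
    # ...and otherwise require the scheme and port to match exactly.
    return old.port != new.port or old.scheme != new.scheme

print(should_strip_auth('http://example.com/a', 'https://example.com/b'))  # False
print(should_strip_auth('https://example.com/a', 'http://example.com/b'))  # True
print(should_strip_auth('https://example.com/a', 'https://evil.test/b'))   # True
```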
def resolve_redirects(self, resp, req, stream=False, timeout=None,
verify=True, cert=None, proxies=None, yield_requests=False, **adapter_kwargs):
"""Receives a Response. Returns a generator of Responses or Requests."""
@@ -123,6 +138,7 @@ def resolve_redirects(self, resp, req, stream=False, timeout=None,
hist = [] # keep track of history
url = self.get_redirect_target(resp)
+ previous_fragment = urlparse(req.url).fragment
while url:
prepared_request = req.copy()
@@ -147,8 +163,12 @@ def resolve_redirects(self, resp, req, stream=False, timeout=None,
parsed_rurl = urlparse(resp.url)
url = '%s:%s' % (to_native_string(parsed_rurl.scheme), url)
- # The scheme should be lower case...
+ # Normalize url case and attach previous fragment if needed (RFC 7231 7.1.2)
parsed = urlparse(url)
+ if parsed.fragment == '' and previous_fragment:
+ parsed = parsed._replace(fragment=previous_fragment)
+ elif parsed.fragment:
+ previous_fragment = parsed.fragment
url = parsed.geturl()
# Facilitate relative 'location' headers, as allowed by RFC 7231.
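The fragment handling can be tried in isolation (stdlib only, hypothetical URLs): the original request's fragment is carried forward unless a redirect target supplies its own.

```python
from urllib.parse import urlparse

# Fragment from the original request.
previous_fragment = urlparse('https://example.com/docs#install').fragment

# A Location header with no fragment of its own.
parsed = urlparse('https://example.com/documentation')
if parsed.fragment == '' and previous_fragment:
    parsed = parsed._replace(fragment=previous_fragment)
elif parsed.fragment:
    previous_fragment = parsed.fragment

print(parsed.geturl())   # https://example.com/documentation#install
```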
@@ -232,14 +252,10 @@ def rebuild_auth(self, prepared_request, response):
headers = prepared_request.headers
url = prepared_request.url
- if 'Authorization' in headers:
+ if 'Authorization' in headers and self.should_strip_auth(response.request.url, url):
# If we get redirected to a new host, we should strip out any
# authentication headers.
- original_parsed = urlparse(response.request.url)
- redirect_parsed = urlparse(url)
-
- if (original_parsed.hostname != redirect_parsed.hostname):
- del headers['Authorization']
+ del headers['Authorization']
# .netrc might have more auth for us on our new host.
new_auth = get_netrc_auth(url) if self.trust_env else None
@@ -295,7 +311,7 @@ def rebuild_method(self, prepared_request, response):
"""
method = prepared_request.method
- # http://tools.ietf.org/html/rfc7231#section-6.4.4
+ # https://tools.ietf.org/html/rfc7231#section-6.4.4
if response.status_code == codes.see_other and method != 'HEAD':
method = 'GET'
@@ -321,13 +337,13 @@ class Session(SessionRedirectMixin):
>>> import requests
>>> s = requests.Session()
- >>> s.get('http://httpbin.org/get')
+ >>> s.get('https://httpbin.org/get')
Or as a context manager::
>>> with requests.Session() as s:
- >>> s.get('http://httpbin.org/get')
+ >>> s.get('https://httpbin.org/get')
"""
@@ -449,8 +465,8 @@ def request(self, method, url,
:param url: URL for the new :class:`Request` object.
:param params: (optional) Dictionary or bytes to be sent in the query
string for the :class:`Request`.
- :param data: (optional) Dictionary, bytes, or file-like object to send
- in the body of the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
:param json: (optional) json to send in the body of the
:class:`Request`.
:param headers: (optional) Dictionary of HTTP Headers to send with the
@@ -546,7 +562,8 @@ def post(self, url, data=None, json=None, **kwargs):
r"""Sends a POST request. Returns :class:`Response` object.
:param url: URL for the new :class:`Request` object.
- :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
:param json: (optional) json to send in the body of the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:rtype: requests.Response
@@ -558,7 +575,8 @@ def put(self, url, data=None, **kwargs):
r"""Sends a PUT request. Returns :class:`Response` object.
:param url: URL for the new :class:`Request` object.
- :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:rtype: requests.Response
"""
@@ -569,7 +587,8 @@ def patch(self, url, data=None, **kwargs):
r"""Sends a PATCH request. Returns :class:`Response` object.
:param url: URL for the new :class:`Request` object.
- :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
+ :param data: (optional) Dictionary, list of tuples, bytes, or file-like
+ object to send in the body of the :class:`Request`.
:param \*\*kwargs: Optional arguments that ``request`` takes.
:rtype: requests.Response
"""
@@ -696,7 +715,7 @@ def get_adapter(self, url):
"""
for (prefix, adapter) in self.adapters.items():
- if url.lower().startswith(prefix):
+ if url.lower().startswith(prefix.lower()):
return adapter
# Nothing matches :-/
@@ -719,7 +738,7 @@ def mount(self, prefix, adapter):
self.adapters[key] = self.adapters.pop(key)
def __getstate__(self):
- state = dict((attr, getattr(self, attr, None)) for attr in self.__attrs__)
+ state = {attr: getattr(self, attr, None) for attr in self.__attrs__}
return state
def __setstate__(self, state):
@@ -731,7 +750,12 @@ def session():
"""
Returns a :class:`Session` for context-management.
+ .. deprecated:: 1.0.0
+
+ This method has been deprecated since version 1.0.0 and is only kept for
+ backwards compatibility. New code should use :class:`~requests.sessions.Session`
+ to create a session. This may be removed at a future date.
+
:rtype: Session
"""
-
return Session()
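The `get_adapter` hunk above lowercases the mounted prefix as well as the URL, making adapter lookup case-insensitive (the new tests near the end of this patch exercise exactly that). A minimal standalone sketch of the matching logic, assuming adapters are kept in a plain `{prefix: adapter}` dict (the names here are hypothetical stand-ins, not the real `Session` internals):

```python
def match_adapter(adapters, url):
    """Return the first mounted adapter whose prefix matches the URL,
    comparing both sides lowercased (mirrors the patched get_adapter)."""
    for prefix, adapter in adapters.items():
        if url.lower().startswith(prefix.lower()):
            return adapter
    return None  # nothing matches

# A mixed-case mount still matches a differently-cased URL.
adapters = {
    'hTtPs://eXamPle.CoM': 'example-adapter',
    'https://': 'default-adapter',
}
print(match_adapter(adapters, 'HtTpS://exaMPLe.cOm/some/path'))  # example-adapter
```

Note that the real `Session.mount` keeps prefixes sorted longest-first so the most specific mount wins; this sketch simply relies on dict insertion order.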
diff --git a/requests/status_codes.py b/requests/status_codes.py
index dee89190c0..813e8c4e62 100644
--- a/requests/status_codes.py
+++ b/requests/status_codes.py
@@ -1,5 +1,22 @@
# -*- coding: utf-8 -*-
+r"""
+The ``codes`` object defines a mapping from common names for HTTP statuses
+to their numerical codes, accessible either as attributes or as dictionary
+items.
+
+>>> requests.codes['temporary_redirect']
+307
+>>> requests.codes.teapot
+418
+>>> requests.codes['\o/']
+200
+
+Some codes have multiple names, and both upper- and lower-case versions of
+the names are allowed. For example, ``codes.ok``, ``codes.OK``, and
+``codes.okay`` all correspond to the HTTP status code 200.
+"""
+
from .structures import LookupDict
_codes = {
@@ -84,8 +101,20 @@
codes = LookupDict(name='status_codes')
-for code, titles in _codes.items():
- for title in titles:
- setattr(codes, title, code)
- if not title.startswith(('\\', '/')):
- setattr(codes, title.upper(), code)
+def _init():
+ for code, titles in _codes.items():
+ for title in titles:
+ setattr(codes, title, code)
+ if not title.startswith(('\\', '/')):
+ setattr(codes, title.upper(), code)
+
+ def doc(code):
+ names = ', '.join('``%s``' % n for n in _codes[code])
+ return '* %d: %s' % (code, names)
+
+ global __doc__
+ __doc__ = (__doc__ + '\n' +
+ '\n'.join(doc(code) for code in sorted(_codes))
+ if __doc__ is not None else None)
+
+_init()
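The new module docstring and `_init()` above describe how `codes` exposes each status under several names, as both attributes and dict-style keys. A minimal sketch of that behavior, using a stripped-down stand-in for `LookupDict` and a hypothetical two-entry subset of the `_codes` table:

```python
class LookupDict(dict):
    """Minimal sketch of requests' LookupDict: item access falls back to the
    instance __dict__, so names registered with setattr also work as keys."""
    def __getitem__(self, key):
        return self.__dict__.get(key)

codes = LookupDict()
_codes = {200: ('ok', 'okay', '\\o/'), 418: ('im_a_teapot', 'teapot')}
for code, titles in _codes.items():
    for title in titles:
        setattr(codes, title, code)
        # Names starting with \ or / (like \o/) get no upper-case alias.
        if not title.startswith(('\\', '/')):
            setattr(codes, title.upper(), code)

print(codes.teapot, codes['ok'], codes.OK)  # 418 200 200
```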
diff --git a/requests/structures.py b/requests/structures.py
index 05d2b3f57b..da930e2852 100644
--- a/requests/structures.py
+++ b/requests/structures.py
@@ -7,16 +7,14 @@
Data structures that power Requests.
"""
-import collections
+from .compat import OrderedDict, Mapping, MutableMapping
-from .compat import OrderedDict
-
-class CaseInsensitiveDict(collections.MutableMapping):
+class CaseInsensitiveDict(MutableMapping):
"""A case-insensitive ``dict``-like object.
Implements all methods and operations of
- ``collections.MutableMapping`` as well as dict's ``copy``. Also
+ ``MutableMapping`` as well as dict's ``copy``. Also
provides ``lower_items``.
All keys are expected to be strings. The structure remembers the
@@ -71,7 +69,7 @@ def lower_items(self):
)
def __eq__(self, other):
- if isinstance(other, collections.Mapping):
+ if isinstance(other, Mapping):
other = CaseInsensitiveDict(other)
else:
return NotImplemented
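The `structures.py` hunk swaps direct `collections.MutableMapping` access for a compat import because on Python 3.3+ the ABCs live in `collections.abc`, and the old `collections` aliases were deprecated and later removed in 3.10. A sketch of the kind of shim the diff presumably adds to `requests/compat.py` (the exact try/except form is an assumption):

```python
try:
    # Python 3: the ABCs moved to collections.abc.
    from collections.abc import Mapping, MutableMapping
except ImportError:
    # Assumed Python 2 fallback, where they still live in collections.
    from collections import Mapping, MutableMapping

print(isinstance({}, Mapping), issubclass(dict, MutableMapping))  # True True
```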
diff --git a/requests/utils.py b/requests/utils.py
index 5c47de9893..0ce7fe115c 100644
--- a/requests/utils.py
+++ b/requests/utils.py
@@ -8,17 +8,17 @@
that are also useful for external consumption.
"""
-import cgi
import codecs
-import collections
import contextlib
import io
import os
-import platform
import re
import socket
import struct
+import sys
+import tempfile
import warnings
+import zipfile
from .__version__ import __version__
from . import certs
@@ -28,7 +28,7 @@
from .compat import (
quote, urlparse, bytes, str, OrderedDict, unquote, getproxies,
proxy_bypass, urlunparse, basestring, integer_types, is_py3,
- proxy_bypass_environment, getproxies_environment)
+ proxy_bypass_environment, getproxies_environment, Mapping)
from .cookies import cookiejar_from_dict
from .structures import CaseInsensitiveDict
from .exceptions import (
@@ -39,19 +39,25 @@
DEFAULT_CA_BUNDLE_PATH = certs.where()
-if platform.system() == 'Windows':
+if sys.platform == 'win32':
# provide a proxy_bypass version on Windows without DNS lookups
def proxy_bypass_registry(host):
- if is_py3:
- import winreg
- else:
- import _winreg as winreg
+ try:
+ if is_py3:
+ import winreg
+ else:
+ import _winreg as winreg
+ except ImportError:
+ return False
+
try:
internetSettings = winreg.OpenKey(winreg.HKEY_CURRENT_USER,
r'Software\Microsoft\Windows\CurrentVersion\Internet Settings')
- proxyEnable = winreg.QueryValueEx(internetSettings,
- 'ProxyEnable')[0]
+ # ProxyEnable could be REG_SZ or REG_DWORD; normalize it to an int
+ proxyEnable = int(winreg.QueryValueEx(internetSettings,
+ 'ProxyEnable')[0])
+ # ProxyOverride is almost always a string
proxyOverride = winreg.QueryValueEx(internetSettings,
'ProxyOverride')[0]
except OSError:
@@ -167,10 +173,10 @@ def get_netrc_auth(url, raise_errors=False):
for f in NETRC_FILES:
try:
- loc = os.path.expanduser('~/{0}'.format(f))
+ loc = os.path.expanduser('~/{}'.format(f))
except KeyError:
# os.path.expanduser can fail when $HOME is undefined and
- # getpwuid fails. See http://bugs.python.org/issue20164 &
+ # getpwuid fails. See https://bugs.python.org/issue20164 &
# https://github.com/requests/requests/issues/1846
return
@@ -216,6 +222,38 @@ def guess_filename(obj):
return os.path.basename(name)
+def extract_zipped_paths(path):
+ """Replace nonexistent paths that look like they refer to a member of a zip
+ archive with the location of an extracted copy of the target, or else
+ just return the provided path unchanged.
+ """
+ if os.path.exists(path):
+ # this is already a valid path, no need to do anything further
+ return path
+
+ # find the first valid part of the provided path and treat that as a zip archive
+ # assume the rest of the path is the name of a member in the archive
+ archive, member = os.path.split(path)
+ while archive and not os.path.exists(archive):
+ archive, prefix = os.path.split(archive)
+ member = '/'.join([prefix, member])
+
+ if not zipfile.is_zipfile(archive):
+ return path
+
+ zip_file = zipfile.ZipFile(archive)
+ if member not in zip_file.namelist():
+ return path
+
+ # we have a valid zip archive and a valid member of that archive
+ tmp = tempfile.gettempdir()
+ extracted_path = os.path.join(tmp, *member.split('/'))
+ if not os.path.exists(extracted_path):
+ extracted_path = zip_file.extract(member, path=tmp)
+
+ return extracted_path
+
+
def from_key_val_list(value):
"""Take an object and test to see if it can be represented as a
dictionary. Unless it can not be represented as such, return an
@@ -262,7 +300,7 @@ def to_key_val_list(value):
if isinstance(value, (str, bytes, bool, int)):
raise ValueError('cannot encode objects that are not 2-tuples')
- if isinstance(value, collections.Mapping):
+ if isinstance(value, Mapping):
value = value.items()
return list(value)
@@ -407,6 +445,31 @@ def get_encodings_from_content(content):
xml_re.findall(content))
+def _parse_content_type_header(header):
+ """Returns content type and parameters from given header
+
+ :param header: string
+ :return: tuple containing content type and dictionary of
+ parameters
+ """
+
+ tokens = header.split(';')
+ content_type, params = tokens[0].strip(), tokens[1:]
+ params_dict = {}
+ items_to_strip = "\"' "
+
+ for param in params:
+ param = param.strip()
+ if param:
+ key, value = param, True
+ index_of_equals = param.find("=")
+ if index_of_equals != -1:
+ key = param[:index_of_equals].strip(items_to_strip)
+ value = param[index_of_equals + 1:].strip(items_to_strip)
+ params_dict[key.lower()] = value
+ return content_type, params_dict
+
+
def get_encoding_from_headers(headers):
"""Returns encodings from given HTTP Header Dict.
@@ -419,7 +482,7 @@ def get_encoding_from_headers(headers):
if not content_type:
return None
- content_type, params = cgi.parse_header(content_type)
+ content_type, params = _parse_content_type_header(content_type)
if 'charset' in params:
return params['charset'].strip("'\"")
@@ -632,6 +695,8 @@ def should_bypass_proxies(url, no_proxy):
:rtype: bool
"""
+ # Prioritize lowercase environment variables over uppercase
+ # to keep a consistent behaviour with other http projects (curl, wget).
get_proxy = lambda k: os.environ.get(k) or os.environ.get(k.upper())
# First check whether no_proxy is defined. If it is, check that the URL
@@ -639,41 +704,43 @@ def should_bypass_proxies(url, no_proxy):
no_proxy_arg = no_proxy
if no_proxy is None:
no_proxy = get_proxy('no_proxy')
- netloc = urlparse(url).netloc
+ parsed = urlparse(url)
+
+ if parsed.hostname is None:
+ # URLs don't always have hostnames, e.g. file:/// urls.
+ return True
if no_proxy:
# We need to check whether we match here. We need to see if we match
- # the end of the netloc, both with and without the port.
+ # the end of the hostname, both with and without the port.
no_proxy = (
host for host in no_proxy.replace(' ', '').split(',') if host
)
- ip = netloc.split(':')[0]
- if is_ipv4_address(ip):
+ if is_ipv4_address(parsed.hostname):
for proxy_ip in no_proxy:
if is_valid_cidr(proxy_ip):
- if address_in_network(ip, proxy_ip):
+ if address_in_network(parsed.hostname, proxy_ip):
return True
- elif ip == proxy_ip:
+ elif parsed.hostname == proxy_ip:
# If no_proxy ip was defined in plain IP notation instead of cidr notation &
# matches the IP of the index
return True
else:
+ host_with_port = parsed.hostname
+ if parsed.port:
+ host_with_port += ':{}'.format(parsed.port)
+
for host in no_proxy:
- if netloc.endswith(host) or netloc.split(':')[0].endswith(host):
+ if parsed.hostname.endswith(host) or host_with_port.endswith(host):
# The URL does match something in no_proxy, so we don't want
# to apply the proxies on this URL.
return True
- # If the system proxy settings indicate that this URL should be bypassed,
- # don't proxy.
- # The proxy_bypass function is incredibly buggy on OS X in early versions
- # of Python 2.6, so allow this call to fail. Only catch the specific
- # exceptions we've seen, though: this call failing in other ways can reveal
- # legitimate problems.
with set_environ('no_proxy', no_proxy_arg):
+ # parsed.hostname can be `None` in cases such as a file URI.
try:
- bypass = proxy_bypass(netloc)
+ bypass = proxy_bypass(parsed.hostname)
except (TypeError, socket.gaierror):
bypass = False
@@ -743,7 +810,7 @@ def default_headers():
def parse_header_links(value):
- """Return a dict of parsed link headers proxies.
+ """Return a list of parsed link headers proxies.
i.e. Link: <http://.../front.jpeg>; rel=front; type="image/jpeg",<http://.../back.jpeg>; rel=back;type="image/jpeg"
@@ -754,6 +821,10 @@ def parse_header_links(value):
replace_chars = ' \'"'
+ value = value.strip(replace_chars)
+ if not value:
+ return links
+
for val in re.split(', *<', value):
try:
url, params = val.split(';', 1)
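Among the `utils.py` changes above, `_parse_content_type_header` replaces the (since-deprecated) `cgi.parse_header` call for extracting the charset. A self-contained restatement of the new helper's logic, so it can be tried outside the library (the function name here is a stand-in for the private helper):

```python
def parse_content_type(header):
    """Split a Content-Type header into (type, params), stripping quotes and
    whitespace around parameter values; bare tokens become boolean flags
    (mirrors the new _parse_content_type_header)."""
    tokens = header.split(';')
    content_type, params = tokens[0].strip(), tokens[1:]
    params_dict = {}
    strip_chars = "\"' "
    for param in params:
        param = param.strip()
        if not param:
            continue  # skip empty segments such as a trailing ';'
        key, value = param, True
        eq = param.find('=')
        if eq != -1:
            key = param[:eq].strip(strip_chars)
            value = param[eq + 1:].strip(strip_chars)
        params_dict[key.lower()] = value
    return content_type, params_dict

print(parse_content_type('text/html; charset="UTF-8"'))
# ('text/html', {'charset': 'UTF-8'})
```

Parameter keys are lowercased, which is what lets `get_encoding_from_headers` look up `'charset'` regardless of the header's original casing.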
diff --git a/requirements.txt b/requirements.txt
deleted file mode 100644
index 8d79283fd9..0000000000
--- a/requirements.txt
+++ /dev/null
@@ -1,16 +0,0 @@
--e .[socks]
-pytest>=2.8.0
-codecov
-pytest-httpbin==0.0.7
-pytest-mock
-pytest-cov
-pytest-xdist
-alabaster
-readme_renderer
-Sphinx<=1.5.5
-PySocks
-setuptools>=18.5
-docutils
-flake8
-tox
-detox
diff --git a/setup.cfg b/setup.cfg
index 2a9acf13da..ed8a958e0a 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -1,2 +1,5 @@
[bdist_wheel]
universal = 1
+
+[metadata]
+license_file = LICENSE
diff --git a/setup.py b/setup.py
index ed4892d41f..4e2ad936eb 100755
--- a/setup.py
+++ b/setup.py
@@ -1,5 +1,5 @@
#!/usr/bin/env python
-
+# Learn more: https://github.com/kennethreitz/setup.py
import os
import re
import sys
@@ -43,27 +43,35 @@ def run_tests(self):
requires = [
'chardet>=3.0.2,<3.1.0',
- 'idna>=2.5,<2.7',
- 'urllib3>=1.21.1,<1.23',
+ 'idna>=2.5,<2.8',
+ 'urllib3>=1.21.1,<1.25',
'certifi>=2017.4.17'
]
-test_requirements = ['pytest-httpbin==0.0.7', 'pytest-cov', 'pytest-mock', 'pytest-xdist', 'PySocks>=1.5.6, !=1.5.7', 'pytest>=2.8.0']
+test_requirements = [
+ 'pytest-httpbin==0.0.7',
+ 'pytest-cov',
+ 'pytest-mock',
+ 'pytest-xdist',
+ 'PySocks>=1.5.6, !=1.5.7',
+ 'pytest>=2.8.0'
+]
about = {}
with open(os.path.join(here, 'requests', '__version__.py'), 'r', 'utf-8') as f:
exec(f.read(), about)
-with open('README.rst', 'r', 'utf-8') as f:
+with open('README.md', 'r', 'utf-8') as f:
readme = f.read()
-with open('HISTORY.rst', 'r', 'utf-8') as f:
+with open('HISTORY.md', 'r', 'utf-8') as f:
history = f.read()
setup(
name=about['__title__'],
version=about['__version__'],
description=about['__description__'],
- long_description=readme + '\n\n' + history,
+ long_description=readme,
+ long_description_content_type='text/markdown',
author=about['__author__'],
author_email=about['__author_email__'],
url=about['__url__'],
@@ -71,30 +79,31 @@ def run_tests(self):
package_data={'': ['LICENSE', 'NOTICE'], 'requests': ['*.pem']},
package_dir={'requests': 'requests'},
include_package_data=True,
+ python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*",
install_requires=requires,
license=about['__license__'],
zip_safe=False,
- classifiers=(
+ classifiers=[
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
'Natural Language :: English',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
- 'Programming Language :: Python :: 2.6',
+ 'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
+ 'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: Implementation :: CPython',
'Programming Language :: Python :: Implementation :: PyPy'
- ),
+ ],
cmdclass={'test': PyTest},
tests_require=test_requirements,
extras_require={
- 'security': ['pyOpenSSL>=0.14', 'cryptography>=1.3.4', 'idna>=2.0.0'],
+ 'security': ['pyOpenSSL >= 0.14', 'cryptography>=1.3.4', 'idna>=2.0.0'],
'socks': ['PySocks>=1.5.6, !=1.5.7'],
- 'socks:sys_platform == "win32" and (python_version == "2.7" or python_version == "2.6")': ['win_inet_pton'],
+ 'socks:sys_platform == "win32" and python_version == "2.7"': ['win_inet_pton'],
},
)
-
diff --git a/tests/test_help.py b/tests/test_help.py
index c11d43f341..3beb65f30a 100644
--- a/tests/test_help.py
+++ b/tests/test_help.py
@@ -7,15 +7,6 @@
from requests.help import info
-@pytest.mark.skipif(sys.version_info[:2] != (2,6), reason="Only run on Python 2.6")
-def test_system_ssl_py26():
- """OPENSSL_VERSION_NUMBER isn't provided in Python 2.6, verify we don't
- blow up in this case.
- """
- assert info()['system_ssl'] == {'version': ''}
-
-
-@pytest.mark.skipif(sys.version_info < (2,7), reason="Only run on Python 2.7+")
def test_system_ssl():
"""Verify we're actually setting system_ssl when it should be available."""
assert info()['system_ssl']['version'] != ''
diff --git a/tests/test_lowlevel.py b/tests/test_lowlevel.py
index c87234bc37..82c3b25a1d 100644
--- a/tests/test_lowlevel.py
+++ b/tests/test_lowlevel.py
@@ -16,7 +16,7 @@ def test_chunked_upload():
data = iter([b'a', b'b', b'c'])
with server as (host, port):
- url = 'http://{0}:{1}/'.format(host, port)
+ url = 'http://{}:{}/'.format(host, port)
r = requests.post(url, data=data, stream=True)
close_server.set() # release server block
@@ -77,7 +77,7 @@ def digest_response_handler(sock):
server = Server(digest_response_handler, wait_to_close_event=close_server)
with server as (host, port):
- url = 'http://{0}:{1}/'.format(host, port)
+ url = 'http://{}:{}/'.format(host, port)
r = requests.get(url, auth=auth)
# Verify server succeeded in authenticating.
assert r.status_code == 200
@@ -127,7 +127,7 @@ def digest_failed_response_handler(sock):
server = Server(digest_failed_response_handler, wait_to_close_event=close_server)
with server as (host, port):
- url = 'http://{0}:{1}/'.format(host, port)
+ url = 'http://{}:{}/'.format(host, port)
r = requests.get(url, auth=auth)
# Verify server didn't authenticate us.
assert r.status_code == 401
@@ -164,7 +164,7 @@ def digest_response_handler(sock):
server = Server(digest_response_handler, wait_to_close_event=close_server)
with server as (host, port):
- url = 'http://{0}:{1}/'.format(host, port)
+ url = 'http://{}:{}/'.format(host, port)
r = requests.get(url, auth=auth)
# Verify server didn't receive auth from us.
assert r.status_code == 200
@@ -181,17 +181,17 @@ def digest_response_handler(sock):
_proxy_combos = []
for prefix, schemes in _schemes_by_var_prefix:
for scheme in schemes:
- _proxy_combos.append(("{0}_proxy".format(prefix), scheme))
+ _proxy_combos.append(("{}_proxy".format(prefix), scheme))
_proxy_combos += [(var.upper(), scheme) for var, scheme in _proxy_combos]
@pytest.mark.parametrize("var,scheme", _proxy_combos)
def test_use_proxy_from_environment(httpbin, var, scheme):
- url = "{0}://httpbin.org".format(scheme)
+ url = "{}://httpbin.org".format(scheme)
fake_proxy = Server() # do nothing with the requests; just close the socket
with fake_proxy as (host, port):
- proxy_url = "socks5://{0}:{1}".format(host, port)
+ proxy_url = "socks5://{}:{}".format(host, port)
kwargs = {var: proxy_url}
with override_environ(**kwargs):
# fake proxy's lack of response will cause a ConnectionError
@@ -212,7 +212,7 @@ def test_redirect_rfc1808_to_non_ascii_location():
def redirect_resp_handler(sock):
consume_socket_content(sock, timeout=0.5)
- location = u'//{0}:{1}/{2}'.format(host, port, path)
+ location = u'//{}:{}/{}'.format(host, port, path)
sock.send(
b'HTTP/1.1 301 Moved Permanently\r\n'
b'Content-Length: 0\r\n'
@@ -226,12 +226,84 @@ def redirect_resp_handler(sock):
server = Server(redirect_resp_handler, wait_to_close_event=close_server)
with server as (host, port):
- url = u'http://{0}:{1}'.format(host, port)
+ url = u'http://{}:{}'.format(host, port)
r = requests.get(url=url, allow_redirects=True)
assert r.status_code == 200
assert len(r.history) == 1
assert r.history[0].status_code == 301
assert redirect_request[0].startswith(b'GET /' + expected_path + b' HTTP/1.1')
- assert r.url == u'{0}/{1}'.format(url, expected_path.decode('ascii'))
+ assert r.url == u'{}/{}'.format(url, expected_path.decode('ascii'))
+
+ close_server.set()
+
+def test_fragment_not_sent_with_request():
+ """Verify that the fragment portion of a URI isn't sent to the server."""
+ def response_handler(sock):
+ req = consume_socket_content(sock, timeout=0.5)
+ sock.send(
+ b'HTTP/1.1 200 OK\r\n'
+ b'Content-Length: ' + str(len(req)).encode() + b'\r\n'
+ b'\r\n'+req
+ )
+
+ close_server = threading.Event()
+ server = Server(response_handler, wait_to_close_event=close_server)
+
+ with server as (host, port):
+ url = 'http://{}:{}/path/to/thing/#view=edit&token=hunter2'.format(host, port)
+ r = requests.get(url)
+ raw_request = r.content
+
+ assert r.status_code == 200
+ headers, body = raw_request.split(b'\r\n\r\n', 1)
+ status_line, headers = headers.split(b'\r\n', 1)
+
+ assert status_line == b'GET /path/to/thing/ HTTP/1.1'
+ for frag in (b'view', b'edit', b'token', b'hunter2'):
+ assert frag not in headers
+ assert frag not in body
+
+ close_server.set()
+
+def test_fragment_update_on_redirect():
+ """Verify we only append previous fragment if one doesn't exist on new
+ location. If a new fragment is encountered in a Location header, it should
+ be added to all subsequent requests.
+ """
+
+ def response_handler(sock):
+ consume_socket_content(sock, timeout=0.5)
+ sock.send(
+ b'HTTP/1.1 302 FOUND\r\n'
+ b'Content-Length: 0\r\n'
+ b'Location: /get#relevant-section\r\n\r\n'
+ )
+ consume_socket_content(sock, timeout=0.5)
+ sock.send(
+ b'HTTP/1.1 302 FOUND\r\n'
+ b'Content-Length: 0\r\n'
+ b'Location: /final-url/\r\n\r\n'
+ )
+ consume_socket_content(sock, timeout=0.5)
+ sock.send(
+ b'HTTP/1.1 200 OK\r\n\r\n'
+ )
+
+ close_server = threading.Event()
+ server = Server(response_handler, wait_to_close_event=close_server)
+
+ with server as (host, port):
+ url = 'http://{}:{}/path/to/thing/#view=edit&token=hunter2'.format(host, port)
+ r = requests.get(url)
+ raw_request = r.content
+
+ assert r.status_code == 200
+ assert len(r.history) == 2
+ assert r.history[0].request.url == url
+
+ # Verify we haven't overwritten the location with our previous fragment.
+ assert r.history[1].request.url == 'http://{}:{}/get#relevant-section'.format(host, port)
+ # Verify previous fragment is used and not the original.
+ assert r.url == 'http://{}:{}/final-url/#relevant-section'.format(host, port)
close_server.set()
diff --git a/tests/test_requests.py b/tests/test_requests.py
index a2b2213f56..4cf5c97351 100644
--- a/tests/test_requests.py
+++ b/tests/test_requests.py
@@ -23,12 +23,13 @@
from requests.exceptions import (
ConnectionError, ConnectTimeout, InvalidSchema, InvalidURL,
MissingSchema, ReadTimeout, Timeout, RetryError, TooManyRedirects,
- ProxyError, InvalidHeader, UnrewindableBodyError, SSLError)
+ ProxyError, InvalidHeader, UnrewindableBodyError, SSLError, InvalidProxyURL)
from requests.models import PreparedRequest
from requests.structures import CaseInsensitiveDict
from requests.sessions import SessionRedirectMixin
from requests.models import urlencode
from requests.hooks import default_hooks
+from requests.compat import MutableMapping
from .compat import StringIO, u
from .utils import override_environ
@@ -54,6 +55,8 @@
class TestRequests:
+ digest_auth_algo = ('MD5', 'SHA-256', 'SHA-512')
+
def test_entry_points(self):
requests.session
@@ -155,7 +158,7 @@ def test_mixed_case_scheme_acceptable(self, httpbin, scheme):
url = scheme + parts.netloc + parts.path
r = requests.Request('GET', url)
r = s.send(r.prepare())
- assert r.status_code == 200, 'failed for scheme {0}'.format(scheme)
+ assert r.status_code == 200, 'failed for scheme {}'.format(scheme)
def test_HTTP_200_OK_GET_ALTERNATIVE(self, httpbin):
r = requests.Request('GET', httpbin('get'))
@@ -294,6 +297,14 @@ def test_transfer_enc_removal_on_redirect(self, httpbin):
for header in purged_headers:
assert header not in next_resp.request.headers
+ def test_fragment_maintained_on_redirect(self, httpbin):
+ fragment = "#view=edit&token=hunter2"
+ r = requests.get(httpbin('redirect-to?url=get')+fragment)
+
+ assert len(r.history) > 0
+ assert r.history[0].request.url == httpbin('redirect-to?url=get')+fragment
+ assert r.url == httpbin('get')+fragment
+
def test_HTTP_200_OK_GET_WITH_PARAMS(self, httpbin):
heads = {'User-agent': 'Mozilla/5.0'}
@@ -526,6 +537,19 @@ def test_proxy_error(self):
with pytest.raises(ProxyError):
requests.get('http://localhost:1', proxies={'http': 'non-resolvable-address'})
+ def test_proxy_error_on_bad_url(self, httpbin, httpbin_secure):
+ with pytest.raises(InvalidProxyURL):
+ requests.get(httpbin_secure(), proxies={'https': 'http:/badproxyurl:3128'})
+
+ with pytest.raises(InvalidProxyURL):
+ requests.get(httpbin(), proxies={'http': 'http://:8080'})
+
+ with pytest.raises(InvalidProxyURL):
+ requests.get(httpbin_secure(), proxies={'https': 'https://'})
+
+ with pytest.raises(InvalidProxyURL):
+ requests.get(httpbin(), proxies={'http': 'http:///example.com:8080'})
+
def test_basicauth_with_netrc(self, httpbin):
auth = ('user', 'pass')
wrong_auth = ('wronguser', 'wrongpass')
@@ -561,70 +585,79 @@ def get_netrc_auth_mock(url):
def test_DIGEST_HTTP_200_OK_GET(self, httpbin):
- auth = HTTPDigestAuth('user', 'pass')
- url = httpbin('digest-auth', 'auth', 'user', 'pass')
+ for authtype in self.digest_auth_algo:
+ auth = HTTPDigestAuth('user', 'pass')
+ url = httpbin('digest-auth', 'auth', 'user', 'pass', authtype, 'never')
- r = requests.get(url, auth=auth)
- assert r.status_code == 200
+ r = requests.get(url, auth=auth)
+ assert r.status_code == 200
- r = requests.get(url)
- assert r.status_code == 401
+ r = requests.get(url)
+ assert r.status_code == 401
+ print(r.headers['WWW-Authenticate'])
- s = requests.session()
- s.auth = HTTPDigestAuth('user', 'pass')
- r = s.get(url)
- assert r.status_code == 200
+ s = requests.session()
+ s.auth = HTTPDigestAuth('user', 'pass')
+ r = s.get(url)
+ assert r.status_code == 200
def test_DIGEST_AUTH_RETURNS_COOKIE(self, httpbin):
- url = httpbin('digest-auth', 'auth', 'user', 'pass')
- auth = HTTPDigestAuth('user', 'pass')
- r = requests.get(url)
- assert r.cookies['fake'] == 'fake_value'
- r = requests.get(url, auth=auth)
- assert r.status_code == 200
+ for authtype in self.digest_auth_algo:
+ url = httpbin('digest-auth', 'auth', 'user', 'pass', authtype)
+ auth = HTTPDigestAuth('user', 'pass')
+ r = requests.get(url)
+ assert r.cookies['fake'] == 'fake_value'
+
+ r = requests.get(url, auth=auth)
+ assert r.status_code == 200
def test_DIGEST_AUTH_SETS_SESSION_COOKIES(self, httpbin):
- url = httpbin('digest-auth', 'auth', 'user', 'pass')
- auth = HTTPDigestAuth('user', 'pass')
- s = requests.Session()
- s.get(url, auth=auth)
- assert s.cookies['fake'] == 'fake_value'
+
+ for authtype in self.digest_auth_algo:
+ url = httpbin('digest-auth', 'auth', 'user', 'pass', authtype)
+ auth = HTTPDigestAuth('user', 'pass')
+ s = requests.Session()
+ s.get(url, auth=auth)
+ assert s.cookies['fake'] == 'fake_value'
def test_DIGEST_STREAM(self, httpbin):
- auth = HTTPDigestAuth('user', 'pass')
- url = httpbin('digest-auth', 'auth', 'user', 'pass')
+ for authtype in self.digest_auth_algo:
+ auth = HTTPDigestAuth('user', 'pass')
+ url = httpbin('digest-auth', 'auth', 'user', 'pass', authtype)
- r = requests.get(url, auth=auth, stream=True)
- assert r.raw.read() != b''
+ r = requests.get(url, auth=auth, stream=True)
+ assert r.raw.read() != b''
- r = requests.get(url, auth=auth, stream=False)
- assert r.raw.read() == b''
+ r = requests.get(url, auth=auth, stream=False)
+ assert r.raw.read() == b''
def test_DIGESTAUTH_WRONG_HTTP_401_GET(self, httpbin):
- auth = HTTPDigestAuth('user', 'wrongpass')
- url = httpbin('digest-auth', 'auth', 'user', 'pass')
+ for authtype in self.digest_auth_algo:
+ auth = HTTPDigestAuth('user', 'wrongpass')
+ url = httpbin('digest-auth', 'auth', 'user', 'pass', authtype)
- r = requests.get(url, auth=auth)
- assert r.status_code == 401
+ r = requests.get(url, auth=auth)
+ assert r.status_code == 401
- r = requests.get(url)
- assert r.status_code == 401
+ r = requests.get(url)
+ assert r.status_code == 401
- s = requests.session()
- s.auth = auth
- r = s.get(url)
- assert r.status_code == 401
+ s = requests.session()
+ s.auth = auth
+ r = s.get(url)
+ assert r.status_code == 401
def test_DIGESTAUTH_QUOTES_QOP_VALUE(self, httpbin):
- auth = HTTPDigestAuth('user', 'pass')
- url = httpbin('digest-auth', 'auth', 'user', 'pass')
+ for authtype in self.digest_auth_algo:
+ auth = HTTPDigestAuth('user', 'pass')
+ url = httpbin('digest-auth', 'auth', 'user', 'pass', authtype)
- r = requests.get(url, auth=auth)
- assert '"auth"' in r.request.headers['Authorization']
+ r = requests.get(url, auth=auth)
+ assert '"auth"' in r.request.headers['Authorization']
def test_POSTBIN_GET_POST_FILES(self, httpbin):
@@ -634,7 +667,7 @@ def test_POSTBIN_GET_POST_FILES(self, httpbin):
post1 = requests.post(url, data={'some': 'data'})
assert post1.status_code == 200
- with open('requirements.txt') as f:
+ with open('Pipfile') as f:
post2 = requests.post(url, files={'some': f})
assert post2.status_code == 200
@@ -644,6 +677,14 @@ def test_POSTBIN_GET_POST_FILES(self, httpbin):
with pytest.raises(ValueError):
requests.post(url, files=['bad file data'])
+ def test_invalid_files_input(self, httpbin):
+
+ url = httpbin('post')
+ post = requests.post(url,
+ files={"random-file-1": None, "random-file-2": 1})
+ assert b'name="random-file-1"' not in post.request.body
+ assert b'name="random-file-2"' in post.request.body
+
def test_POSTBIN_SEEKED_OBJECT_WITH_NO_ITER(self, httpbin):
class TestStream(object):
@@ -694,7 +735,7 @@ def test_POSTBIN_GET_POST_FILES_WITH_DATA(self, httpbin):
post1 = requests.post(url, data={'some': 'data'})
assert post1.status_code == 200
- with open('requirements.txt') as f:
+ with open('Pipfile') as f:
post2 = requests.post(url, data={'some': 'data'}, files={'some': f})
assert post2.status_code == 200
@@ -705,7 +746,7 @@ def test_POSTBIN_GET_POST_FILES_WITH_DATA(self, httpbin):
requests.post(url, files=['bad file data'])
def test_post_with_custom_mapping(self, httpbin):
- class CustomMapping(collections.MutableMapping):
+ class CustomMapping(MutableMapping):
def __init__(self, *args, **kwargs):
self.data = dict(*args, **kwargs)
@@ -731,7 +772,7 @@ def __len__(self):
def test_conflicting_post_params(self, httpbin):
url = httpbin('post')
- with open('requirements.txt') as f:
+ with open('Pipfile') as f:
pytest.raises(ValueError, "requests.post(url, data='[{\"some\": \"data\"}]', files={'some': f})")
pytest.raises(ValueError, "requests.post(url, data=u('[{\"some\": \"data\"}]'), files={'some': f})")
@@ -775,17 +816,17 @@ def test_invalid_ca_certificate_path(self, httpbin_secure):
INVALID_PATH = '/garbage'
with pytest.raises(IOError) as e:
requests.get(httpbin_secure(), verify=INVALID_PATH)
- assert str(e.value) == 'Could not find a suitable TLS CA certificate bundle, invalid path: {0}'.format(INVALID_PATH)
+ assert str(e.value) == 'Could not find a suitable TLS CA certificate bundle, invalid path: {}'.format(INVALID_PATH)
def test_invalid_ssl_certificate_files(self, httpbin_secure):
INVALID_PATH = '/garbage'
with pytest.raises(IOError) as e:
requests.get(httpbin_secure(), cert=INVALID_PATH)
- assert str(e.value) == 'Could not find the TLS certificate file, invalid path: {0}'.format(INVALID_PATH)
+ assert str(e.value) == 'Could not find the TLS certificate file, invalid path: {}'.format(INVALID_PATH)
with pytest.raises(IOError) as e:
requests.get(httpbin_secure(), cert=('.', INVALID_PATH))
- assert str(e.value) == 'Could not find the TLS key file, invalid path: {0}'.format(INVALID_PATH)
+ assert str(e.value) == 'Could not find the TLS key file, invalid path: {}'.format(INVALID_PATH)
def test_http_with_certificate(self, httpbin):
r = requests.get(httpbin(), cert='.')
@@ -823,10 +864,16 @@ def test_certificate_failure(self, httpbin_secure):
def test_urlencoded_get_query_multivalued_param(self, httpbin):
- r = requests.get(httpbin('get'), params=dict(test=['foo', 'baz']))
+ r = requests.get(httpbin('get'), params={'test': ['foo', 'baz']})
assert r.status_code == 200
assert r.url == httpbin('get?test=foo&test=baz')
+ def test_form_encoded_post_query_multivalued_element(self, httpbin):
+ r = requests.Request(method='POST', url=httpbin('post'),
+ data=dict(test=['foo', 'baz']))
+ prep = r.prepare()
+ assert prep.body == 'test=foo&test=baz'
+
def test_different_encodings_dont_break_post(self, httpbin):
r = requests.post(httpbin('post'),
data={'stuff': json.dumps({'a': 123})},
@@ -1131,6 +1178,14 @@ def test_cookie_duplicate_names_raises_cookie_conflict_error(self):
with pytest.raises(requests.cookies.CookieConflictError):
jar.get(key)
+ def test_cookie_policy_copy(self):
+ class MyCookiePolicy(cookielib.DefaultCookiePolicy):
+ pass
+
+ jar = requests.cookies.RequestsCookieJar()
+ jar.set_policy(MyCookiePolicy())
+ assert isinstance(jar.copy().get_policy(), MyCookiePolicy)
+
def test_time_elapsed_blank(self, httpbin):
r = requests.get(httpbin('get'))
td = r.elapsed
@@ -1351,6 +1406,44 @@ def test_transport_adapter_ordering(self):
assert 'http://' in s2.adapters
assert 'https://' in s2.adapters
+ def test_session_get_adapter_prefix_matching(self):
+ prefix = 'https://example.com'
+ more_specific_prefix = prefix + '/some/path'
+
+ url_matching_only_prefix = prefix + '/another/path'
+ url_matching_more_specific_prefix = more_specific_prefix + '/longer/path'
+ url_not_matching_prefix = 'https://another.example.com/'
+
+ s = requests.Session()
+ prefix_adapter = HTTPAdapter()
+ more_specific_prefix_adapter = HTTPAdapter()
+ s.mount(prefix, prefix_adapter)
+ s.mount(more_specific_prefix, more_specific_prefix_adapter)
+
+ assert s.get_adapter(url_matching_only_prefix) is prefix_adapter
+ assert s.get_adapter(url_matching_more_specific_prefix) is more_specific_prefix_adapter
+ assert s.get_adapter(url_not_matching_prefix) not in (prefix_adapter, more_specific_prefix_adapter)
+
+ def test_session_get_adapter_prefix_matching_mixed_case(self):
+ mixed_case_prefix = 'hTtPs://eXamPle.CoM/MixEd_CAse_PREfix'
+ url_matching_prefix = mixed_case_prefix + '/full_url'
+
+ s = requests.Session()
+ my_adapter = HTTPAdapter()
+ s.mount(mixed_case_prefix, my_adapter)
+
+ assert s.get_adapter(url_matching_prefix) is my_adapter
+
+ def test_session_get_adapter_prefix_matching_is_case_insensitive(self):
+ mixed_case_prefix = 'hTtPs://eXamPle.CoM/MixEd_CAse_PREfix'
+ url_matching_prefix_with_different_case = 'HtTpS://exaMPLe.cOm/MiXeD_caSE_preFIX/another_url'
+
+ s = requests.Session()
+ my_adapter = HTTPAdapter()
+ s.mount(mixed_case_prefix, my_adapter)
+
+ assert s.get_adapter(url_matching_prefix_with_different_case) is my_adapter
+
def test_header_remove_is_case_insensitive(self, httpbin):
# From issue #1321
s = requests.Session()
@@ -1365,7 +1458,7 @@ def test_params_are_merged_case_sensitive(self, httpbin):
assert r.json()['args'] == {'foo': 'bar', 'FOO': 'bar'}
def test_long_authinfo_in_url(self):
- url = 'http://{0}:{1}@{2}:9000/path?query#frag'.format(
+ url = 'http://{}:{}@{}:9000/path?query#frag'.format(
'E8A3BE87-9E3F-4620-8858-95478E385B5B',
'EA770032-DA4D-4D84-8CE9-29C6D910BF1E',
'exactly-------------sixty-----------three------------characters',
@@ -1480,15 +1573,15 @@ def test_nonhttp_schemes_dont_check_URLs(self):
preq = req.prepare()
assert test_url == preq.url
- @pytest.mark.xfail(raises=ConnectionError)
- def test_auth_is_stripped_on_redirect_off_host(self, httpbin):
+ def test_auth_is_stripped_on_http_downgrade(self, httpbin, httpbin_secure, httpbin_ca_bundle):
r = requests.get(
- httpbin('redirect-to'),
- params={'url': 'http://www.google.co.uk'},
+ httpbin_secure('redirect-to'),
+ params={'url': httpbin('get')},
auth=('user', 'pass'),
+ verify=httpbin_ca_bundle
)
assert r.history[0].request.headers['Authorization']
- assert not r.request.headers.get('Authorization', '')
+ assert 'Authorization' not in r.request.headers
def test_auth_is_retained_for_redirect_on_host(self, httpbin):
r = requests.get(httpbin('redirect/1'), auth=('user', 'pass'))
@@ -1497,6 +1590,27 @@ def test_auth_is_retained_for_redirect_on_host(self, httpbin):
assert h1 == h2
+ def test_should_strip_auth_host_change(self):
+ s = requests.Session()
+ assert s.should_strip_auth('http://example.com/foo', 'http://another.example.com/')
+
+ def test_should_strip_auth_http_downgrade(self):
+ s = requests.Session()
+ assert s.should_strip_auth('https://example.com/foo', 'http://example.com/bar')
+
+ def test_should_strip_auth_https_upgrade(self):
+ s = requests.Session()
+ assert not s.should_strip_auth('http://example.com/foo', 'https://example.com/bar')
+ assert not s.should_strip_auth('http://example.com:80/foo', 'https://example.com/bar')
+ assert not s.should_strip_auth('http://example.com/foo', 'https://example.com:443/bar')
+ # Non-standard ports should trigger stripping
+ assert s.should_strip_auth('http://example.com:8080/foo', 'https://example.com/bar')
+ assert s.should_strip_auth('http://example.com/foo', 'https://example.com:8443/bar')
+
+ def test_should_strip_auth_port_change(self):
+ s = requests.Session()
+ assert s.should_strip_auth('http://example.com:1234/foo', 'https://example.com:4321/bar')
+
def test_manual_redirect_with_partial_body_read(self, httpbin):
s = requests.Session()
r1 = s.get(httpbin('redirect/2'), allow_redirects=False, stream=True)
@@ -1518,13 +1632,11 @@ def test_manual_redirect_with_partial_body_read(self, httpbin):
def test_prepare_body_position_non_stream(self):
data = b'the data'
- s = requests.Session()
prep = requests.Request('GET', 'http://example.com', data=data).prepare()
assert prep._body_position is None
def test_rewind_body(self):
data = io.BytesIO(b'the data')
- s = requests.Session()
prep = requests.Request('GET', 'http://example.com', data=data).prepare()
assert prep._body_position == 0
assert prep.body.read() == b'the data'
@@ -1538,7 +1650,6 @@ def test_rewind_body(self):
def test_rewind_partially_read_body(self):
data = io.BytesIO(b'the data')
- s = requests.Session()
data.read(4) # read some data
prep = requests.Request('GET', 'http://example.com', data=data).prepare()
assert prep._body_position == 4
@@ -1563,7 +1674,6 @@ def __iter__(self):
return
data = BadFileObj('the data')
- s = requests.Session()
prep = requests.Request('GET', 'http://example.com', data=data).prepare()
assert prep._body_position == 0
@@ -1587,7 +1697,6 @@ def __iter__(self):
return
data = BadFileObj('the data')
- s = requests.Session()
prep = requests.Request('GET', 'http://example.com', data=data).prepare()
assert prep._body_position == 0
@@ -1608,7 +1717,6 @@ def __iter__(self):
return
data = BadFileObj('the data')
- s = requests.Session()
prep = requests.Request('GET', 'http://example.com', data=data).prepare()
assert prep._body_position is not None
@@ -1714,12 +1822,12 @@ def test_session_close_proxy_clear(self, mocker):
proxies['one'].clear.assert_called_once_with()
proxies['two'].clear.assert_called_once_with()
- def test_proxy_auth(self, httpbin):
+ def test_proxy_auth(self):
adapter = HTTPAdapter()
headers = adapter.proxy_headers("http://user:pass@httpbin.org")
assert headers == {'Proxy-Authorization': 'Basic dXNlcjpwYXNz'}
- def test_proxy_auth_empty_pass(self, httpbin):
+ def test_proxy_auth_empty_pass(self):
adapter = HTTPAdapter()
headers = adapter.proxy_headers("http://user:@httpbin.org")
assert headers == {'Proxy-Authorization': 'Basic dXNlcjo='}
@@ -2318,6 +2426,16 @@ def test_preparing_bad_url(self, url):
with pytest.raises(requests.exceptions.InvalidURL):
r.prepare()
+ @pytest.mark.parametrize(
+ 'url, exception',
+ (
+ ('http://localhost:-1', InvalidURL),
+ )
+ )
+ def test_redirecting_to_bad_url(self, httpbin, url, exception):
+ with pytest.raises(exception):
+ requests.get(httpbin('redirect-to'), params={'url': url})
+
@pytest.mark.parametrize(
'input, expected',
(
diff --git a/tests/test_testserver.py b/tests/test_testserver.py
index 3c770759c3..aac529261b 100644
--- a/tests/test_testserver.py
+++ b/tests/test_testserver.py
@@ -50,7 +50,7 @@ def test_text_response(self):
)
with server as (host, port):
- r = requests.get('http://{0}:{1}'.format(host, port))
+ r = requests.get('http://{}:{}'.format(host, port))
assert r.status_code == 200
assert r.text == u'roflol'
@@ -59,7 +59,7 @@ def test_text_response(self):
def test_basic_response(self):
"""the basic response server returns an empty http response"""
with Server.basic_response_server() as (host, port):
- r = requests.get('http://{0}:{1}'.format(host, port))
+ r = requests.get('http://{}:{}'.format(host, port))
assert r.status_code == 200
assert r.text == u''
assert r.headers['Content-Length'] == '0'
@@ -83,7 +83,7 @@ def test_multiple_requests(self):
server = Server.basic_response_server(requests_to_handle=requests_to_handle)
with server as (host, port):
- server_url = 'http://{0}:{1}'.format(host, port)
+ server_url = 'http://{}:{}'.format(host, port)
for _ in range(requests_to_handle):
r = requests.get(server_url)
assert r.status_code == 200
diff --git a/tests/test_utils.py b/tests/test_utils.py
index b3f398eed1..59b0b0efaf 100644
--- a/tests/test_utils.py
+++ b/tests/test_utils.py
@@ -2,15 +2,18 @@
import os
import copy
+import filecmp
from io import BytesIO
+import zipfile
+from collections import deque
import pytest
from requests import compat
from requests.cookies import RequestsCookieJar
from requests.structures import CaseInsensitiveDict
from requests.utils import (
- address_in_network, dotted_netmask,
- get_auth_from_url, get_encoding_from_headers,
+ address_in_network, dotted_netmask, extract_zipped_paths,
+ get_auth_from_url, _parse_content_type_header, get_encoding_from_headers,
get_encodings_from_content, get_environ_proxies,
guess_filename, guess_json_utf, is_ipv4_address,
is_valid_cidr, iter_slices, parse_dict_header,
@@ -256,6 +259,32 @@ def test_guess_filename_valid(self, value, expected_type):
assert isinstance(result, expected_type)
+class TestExtractZippedPaths:
+
+ @pytest.mark.parametrize(
+ 'path', (
+ '/',
+ __file__,
+ pytest.__file__,
+ '/etc/invalid/location',
+ ))
+ def test_unzipped_paths_unchanged(self, path):
+ assert path == extract_zipped_paths(path)
+
+ def test_zipped_paths_extracted(self, tmpdir):
+ zipped_py = tmpdir.join('test.zip')
+ with zipfile.ZipFile(zipped_py.strpath, 'w') as f:
+ f.write(__file__)
+
+ _, name = os.path.splitdrive(__file__)
+ zipped_path = os.path.join(zipped_py.strpath, name.lstrip(r'\/'))
+ extracted_path = extract_zipped_paths(zipped_path)
+
+ assert extracted_path != zipped_path
+ assert os.path.exists(extracted_path)
+ assert filecmp.cmp(extracted_path, __file__)
+
+
class TestContentEncodingDetection:
def test_none(self):
@@ -441,6 +470,49 @@ def test_parse_dict_header(value, expected):
assert parse_dict_header(value) == expected
+@pytest.mark.parametrize(
+ 'value, expected', (
+ (
+ 'application/xml',
+ ('application/xml', {})
+ ),
+ (
+ 'application/json ; charset=utf-8',
+ ('application/json', {'charset': 'utf-8'})
+ ),
+ (
+ 'application/json ; Charset=utf-8',
+ ('application/json', {'charset': 'utf-8'})
+ ),
+ (
+ 'text/plain',
+ ('text/plain', {})
+ ),
+ (
+ 'multipart/form-data; boundary = something ; boundary2=\'something_else\' ; no_equals ',
+ ('multipart/form-data', {'boundary': 'something', 'boundary2': 'something_else', 'no_equals': True})
+ ),
+ (
+ 'multipart/form-data; boundary = something ; boundary2="something_else" ; no_equals ',
+ ('multipart/form-data', {'boundary': 'something', 'boundary2': 'something_else', 'no_equals': True})
+ ),
+ (
+ 'multipart/form-data; boundary = something ; \'boundary2=something_else\' ; no_equals ',
+ ('multipart/form-data', {'boundary': 'something', 'boundary2': 'something_else', 'no_equals': True})
+ ),
+ (
+ 'multipart/form-data; boundary = something ; "boundary2=something_else" ; no_equals ',
+ ('multipart/form-data', {'boundary': 'something', 'boundary2': 'something_else', 'no_equals': True})
+ ),
+ (
+ 'application/json ; ; ',
+ ('application/json', {})
+ )
+ ))
+def test__parse_content_type_header(value, expected):
+ assert _parse_content_type_header(value) == expected
+
+
@pytest.mark.parametrize(
'value, expected', (
(
@@ -498,6 +570,10 @@ def test_iter_slices(value, length):
{'url': 'http://.../back.jpeg'}
]
),
+ (
+ '',
+ []
+ ),
))
def test_parse_header_links(value, expected):
assert parse_header_links(value) == expected
@@ -542,19 +618,41 @@ def test_urldefragauth(url, expected):
('http://172.16.1.1/', True),
('http://172.16.1.1:5000/', True),
('http://localhost.localdomain:5000/v1.0/', True),
+ ('http://google.com:6000/', True),
('http://172.16.1.12/', False),
('http://172.16.1.12:5000/', False),
('http://google.com:5000/v1.0/', False),
+ ('file:///some/path/on/disk', True),
))
def test_should_bypass_proxies(url, expected, monkeypatch):
"""Tests for function should_bypass_proxies to check if proxy
can be bypassed or not
"""
- monkeypatch.setenv('no_proxy', '192.168.0.0/24,127.0.0.1,localhost.localdomain,172.16.1.1')
- monkeypatch.setenv('NO_PROXY', '192.168.0.0/24,127.0.0.1,localhost.localdomain,172.16.1.1')
+ monkeypatch.setenv('no_proxy', '192.168.0.0/24,127.0.0.1,localhost.localdomain,172.16.1.1, google.com:6000')
+ monkeypatch.setenv('NO_PROXY', '192.168.0.0/24,127.0.0.1,localhost.localdomain,172.16.1.1, google.com:6000')
assert should_bypass_proxies(url, no_proxy=None) == expected
+@pytest.mark.parametrize(
+ 'url, expected', (
+ ('http://172.16.1.1/', '172.16.1.1'),
+ ('http://172.16.1.1:5000/', '172.16.1.1'),
+ ('http://user:pass@172.16.1.1', '172.16.1.1'),
+ ('http://user:pass@172.16.1.1:5000', '172.16.1.1'),
+ ('http://hostname/', 'hostname'),
+ ('http://hostname:5000/', 'hostname'),
+ ('http://user:pass@hostname', 'hostname'),
+ ('http://user:pass@hostname:5000', 'hostname'),
+ ))
+def test_should_bypass_proxies_pass_only_hostname(url, expected, mocker):
+ """The proxy_bypass function should be called with a hostname or IP without
+ a port number or auth credentials.
+ """
+ proxy_bypass = mocker.patch('requests.utils.proxy_bypass')
+ should_bypass_proxies(url, no_proxy=None)
+ proxy_bypass.assert_called_once_with(expected)
+
+
@pytest.mark.parametrize(
'cookiejar', (
compat.cookielib.CookieJar(),
@@ -567,7 +665,7 @@ def test_add_dict_to_cookiejar(cookiejar):
cookiedict = {'test': 'cookies',
'good': 'cookies'}
cj = add_dict_to_cookiejar(cookiejar, cookiedict)
- cookies = dict((cookie.name, cookie.value) for cookie in cj)
+ cookies = {cookie.name: cookie.value for cookie in cj}
assert cookiedict == cookies
@@ -634,6 +732,7 @@ def Close(self):
pass
ie_settings = RegHandle()
+ proxyEnableValues = deque([1, "1"])
def OpenKey(key, subkey):
return ie_settings
@@ -641,7 +740,9 @@ def OpenKey(key, subkey):
def QueryValueEx(key, value_name):
if key is ie_settings:
if value_name == 'ProxyEnable':
- return [1]
+ # this could be a string (REG_SZ) or a 32-bit number (REG_DWORD)
+ proxyEnableValues.rotate()
+ return [proxyEnableValues[0]]
elif value_name == 'ProxyOverride':
return [override]
@@ -652,6 +753,7 @@ def QueryValueEx(key, value_name):
monkeypatch.setenv('NO_PROXY', '')
monkeypatch.setattr(winreg, 'OpenKey', OpenKey)
monkeypatch.setattr(winreg, 'QueryValueEx', QueryValueEx)
+ assert should_bypass_proxies(url, None) == expected
@pytest.mark.parametrize(
diff --git a/tox.ini b/tox.ini
index 2a961c826a..47b68ba542 100644
--- a/tox.ini
+++ b/tox.ini
@@ -1,8 +1,8 @@
[tox]
-envlist = py26,py27,py33,py34,py35,py36
+envlist = py27,py34,py35,py36
[testenv]
commands =
pip install -e .[socks]
- python setup.py test
\ No newline at end of file
+ python setup.py test