
The Requests library is the de facto standard for handling HTTP in Python. Each time I have to write a crawler or a REST API client, I know what to use. I have written a dozen of them over the last couple of years, and each time I stumbled over the same frustrating thing: requests.exceptions.ConnectionError, unexpectedly raised with the message error(111, 'Connection refused') after 3–5 hours of client uptime, while the remote service works fine and stays available.

I don’t know for sure why it happens. I have a couple of theories, but essentially they all come down to the imperfect world we live in. A connection may die or hang. A heavily loaded web server may refuse a request. Packets may be lost. Long story short: shit happens. And when it happens, the default Requests settings are not enough.

So if you are going to build a long-lived process that uses some services via Requests, you should change its default settings like this:

from requests import Session
from requests.adapters import HTTPAdapter


session = Session()
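# Remount both schemes with adapters that retry failed connections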
session.mount('http://', HTTPAdapter(max_retries=5))
session.mount('https://', HTTPAdapter(max_retries=5))

By default, HTTPAdapter performs only one attempt and raises ConnectionError on failure. I started with two attempts and found empirically that five of them give 100% resistance against short-term downtimes.
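For illustration, here is a minimal usage sketch, assuming a long-lived polling client; the URL is just a placeholder, and the retry budget of five attempts is the one discussed above:

import time

from requests import Session
from requests.adapters import HTTPAdapter
from requests.exceptions import ConnectionError

session = Session()
session.mount('http://', HTTPAdapter(max_retries=5))
session.mount('https://', HTTPAdapter(max_retries=5))

while True:
    try:
        # Transient connection failures are retried by the adapter
        # before ConnectionError ever reaches this code.
        response = session.get('http://example.com/api/items')
        print(response.status_code)
    except ConnectionError:
        # Five retries may still not survive a long outage.
        print('service is down, will try again later')
    time.sleep(60)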

I am not sure whether this is a bug or a feature of Requests. But I have never seen these defaults changed in any Requests-based library, such as Twitter or Facebook API clients, and I have gotten such errors while using those libraries too. So if you use such a library, examine its code. Now you know how to fix it. Thanks to Python’s design, there are no truly private members.
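As a rough sketch of what I mean (the library name and its session attribute are hypothetical, not a real API), you can reach into such a client’s internal Session and remount the adapters yourself:

from requests.adapters import HTTPAdapter

import some_api_client  # hypothetical Requests-based library

client = some_api_client.Client()
# Assumption: the library keeps its Session in an accessible attribute;
# find the real attribute name by reading the library's source.
client.session.mount('http://', HTTPAdapter(max_retries=5))
client.session.mount('https://', HTTPAdapter(max_retries=5))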

Unfortunately, I cannot reproduce this bug (if it is a real bug) under laboratory conditions for now. So I would be grateful if somebody suggested how to do it.