I'm using the requests library to fetch a simple URL (the URL here is a dummy; my actual code uses a real one):
import requests

headers = {
    "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36"
}
response = requests.get("http://example.com/foo/bar/", headers=headers)
Locally it works fine, but when I run the same code on my server, the request takes forever to finish. I've enabled debug logging for all of these loggers:
urllib3.util.retry
urllib3.util
urllib3
urllib3.connection
urllib3.response
urllib3.connectionpool
urllib3.poolmanager
requests
This is the only output they produce:
2018-05-31 19:55:56,894 - urllib3.connectionpool - DEBUG - Starting new HTTP connection (1): example.com
2018-05-31 19:58:06,676 - urllib3.connectionpool - DEBUG - http://example.com:80 "GET /foo/bar/ HTTP/1.1" 200 None
The odd thing is that the request always takes exactly 2 minutes and 10 seconds to finish (disregarding milliseconds). Locally it's instant.
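To rule out my surrounding code, I've been timing the call with a small helper. A minimal sketch, where the `time.sleep` stands in for the real `requests.get` call shown above:

```python
import time

def timed(fn):
    """Run a zero-argument callable and report how long it took."""
    start = time.monotonic()
    result = fn()
    elapsed = time.monotonic() - start
    print(f"took {elapsed:.3f}s")
    return result, elapsed

# Illustrative stand-in for the real request; in my code fn is the
# requests.get call, and elapsed comes out at ~130 seconds on the server.
result, elapsed = timed(lambda: time.sleep(0.1))
```

(requests also exposes `response.elapsed`, the time between sending the request and parsing the response headers, which agrees with the wall-clock numbers.)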
Any clues where I should look next?