0s autopkgtest [14:43:31]: starting date and time: 2025-06-19 14:43:31+0000
0s autopkgtest [14:43:31]: git checkout: 9986aa8c Merge branch 'skia/fix_network_interface' into 'ubuntu/production'
0s autopkgtest [14:43:31]: host juju-7f2275-prod-proposed-migration-environment-21; command line: /home/ubuntu/autopkgtest/runner/autopkgtest --output-dir /tmp/autopkgtest-work.72_m58z1/out --timeout-copy=6000 --setup-commands /home/ubuntu/autopkgtest-cloud/worker-config-production/setup-canonical.sh --apt-pocket=proposed=src:requests --apt-upgrade pypuppetdb --timeout-short=300 --timeout-copy=20000 --timeout-build=20000 --env=ADT_TEST_TRIGGERS=requests/2.32.3+dfsg-5ubuntu2 -- ssh -s /home/ubuntu/autopkgtest/ssh-setup/nova -- --flavor autopkgtest-cpu2-ram4-disk20-ppc64el --security-groups autopkgtest-juju-7f2275-prod-proposed-migration-environment-21@sto01-ppc64el-15.secgroup --name adt-questing-ppc64el-pypuppetdb-20250619-144330-juju-7f2275-prod-proposed-migration-environment-21-0e8d970d-a213-4f22-84ec-9e8243d4143e --image adt/ubuntu-questing-ppc64el-server --keyname testbed-juju-7f2275-prod-proposed-migration-environment-21 --net-id=net_prod-autopkgtest-workers-ppc64el -e TERM=linux --mirror=http://ftpmaster.internal/ubuntu/
126s autopkgtest [14:45:37]: testbed dpkg architecture: ppc64el
126s autopkgtest [14:45:37]: testbed apt version: 3.1.2
126s autopkgtest [14:45:37]: @@@@@@@@@@@@@@@@@@@@ test bed setup
126s autopkgtest [14:45:37]: testbed release detected to be: None
127s autopkgtest [14:45:38]: updating testbed package index (apt update)
127s Get:1 http://ftpmaster.internal/ubuntu questing-proposed InRelease [249 kB]
127s Hit:2 http://ftpmaster.internal/ubuntu questing InRelease
128s Hit:3 http://ftpmaster.internal/ubuntu questing-updates InRelease
128s Hit:4 http://ftpmaster.internal/ubuntu questing-security InRelease
128s Get:5 http://ftpmaster.internal/ubuntu questing-proposed/main Sources [38.3 kB]
128s Get:6 http://ftpmaster.internal/ubuntu questing-proposed/multiverse Sources [17.4 kB]
128s Get:7 http://ftpmaster.internal/ubuntu questing-proposed/restricted Sources [4716 B]
128s Get:8 http://ftpmaster.internal/ubuntu questing-proposed/universe Sources [426 kB]
128s Get:9 http://ftpmaster.internal/ubuntu questing-proposed/main ppc64el Packages [66.7 kB]
128s Get:10 http://ftpmaster.internal/ubuntu questing-proposed/restricted ppc64el Packages [724 B]
128s Get:11 http://ftpmaster.internal/ubuntu questing-proposed/universe ppc64el Packages [340 kB]
128s Get:12 http://ftpmaster.internal/ubuntu questing-proposed/multiverse ppc64el Packages [6448 B]
128s Fetched 1149 kB in 0s (2411 kB/s)
129s Reading package lists...
129s autopkgtest [14:45:40]: upgrading testbed (apt dist-upgrade and autopurge)
129s Reading package lists...
130s Building dependency tree...
130s Reading state information...
130s Calculating upgrade...
130s The following packages will be upgraded:
130s   python3-requests
130s 1 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
130s Need to get 53.1 kB of archives.
130s After this operation, 0 B of additional disk space will be used.
130s Get:1 http://ftpmaster.internal/ubuntu questing-proposed/main ppc64el python3-requests all 2.32.3+dfsg-5ubuntu2 [53.1 kB]
130s Fetched 53.1 kB in 0s (4530 kB/s)
131s (Reading database ... 79652 files and directories currently installed.)
131s Preparing to unpack .../python3-requests_2.32.3+dfsg-5ubuntu2_all.deb ...
131s Unpacking python3-requests (2.32.3+dfsg-5ubuntu2) over (2.32.3+dfsg-5ubuntu1) ...
131s Setting up python3-requests (2.32.3+dfsg-5ubuntu2) ...
131s Reading package lists...
131s Building dependency tree...
131s Reading state information...
132s Solving dependencies...
132s 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
134s autopkgtest [14:45:45]: testbed running kernel: Linux 6.14.0-15-generic #15-Ubuntu SMP Sun Apr 6 14:52:42 UTC 2025
134s autopkgtest [14:45:45]: @@@@@@@@@@@@@@@@@@@@ apt-source pypuppetdb
135s Get:1 http://ftpmaster.internal/ubuntu questing/universe pypuppetdb 3.2.0-1 (dsc) [2327 B]
135s Get:2 http://ftpmaster.internal/ubuntu questing/universe pypuppetdb 3.2.0-1 (tar) [47.7 kB]
135s Get:3 http://ftpmaster.internal/ubuntu questing/universe pypuppetdb 3.2.0-1 (diff) [3756 B]
135s gpgv: Signature made Sat Jan 20 14:58:34 2024 UTC
135s gpgv: using RSA key 8F6DE104377F3B11E741748731F3144544A1741A
135s gpgv: issuer "tchet@debian.org"
135s gpgv: Can't check signature: No public key
135s dpkg-source: warning: cannot verify inline signature for ./pypuppetdb_3.2.0-1.dsc: no acceptable signature found
135s autopkgtest [14:45:46]: testing package pypuppetdb version 3.2.0-1
135s autopkgtest [14:45:46]: build not needed
136s autopkgtest [14:45:47]: test unittests: preparing testbed
136s Reading package lists...
136s Building dependency tree...
136s Reading state information...
136s Solving dependencies...
136s The following NEW packages will be installed:
136s   git git-man liberror-perl python3-all python3-bandit python3-git
136s   python3-gitdb python3-httpretty python3-importlib-metadata python3-iniconfig
136s   python3-jschema-to-python python3-jsonpickle python3-mypy
136s   python3-mypy-extensions python3-pbr python3-pluggy python3-psutil
136s   python3-pypuppetdb python3-pytest python3-sarif-python-om python3-smmap
136s   python3-stevedore
136s 0 upgraded, 22 newly installed, 0 to remove and 0 not upgraded.
136s Need to get 18.4 MB of archives.
136s After this operation, 101 MB of additional disk space will be used.
136s Get:1 http://ftpmaster.internal/ubuntu questing/main ppc64el liberror-perl all 0.17030-1 [23.5 kB]
136s Get:2 http://ftpmaster.internal/ubuntu questing/main ppc64el git-man all 1:2.48.1-0ubuntu1 [1148 kB]
136s Get:3 http://ftpmaster.internal/ubuntu questing/main ppc64el git ppc64el 1:2.48.1-0ubuntu1 [7338 kB]
137s Get:4 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-all ppc64el 3.13.4-1 [880 B]
137s Get:5 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-smmap all 6.0.0-1 [20.5 kB]
137s Get:6 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-gitdb all 4.0.12-1 [46.4 kB]
137s Get:7 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-git all 3.1.44-1 [148 kB]
137s Get:8 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-httpretty all 1.1.4-4 [23.1 kB]
137s Get:9 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-importlib-metadata all 8.7.0-2 [21.0 kB]
137s Get:10 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-iniconfig all 1.1.1-2 [6024 B]
137s Get:11 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-jsonpickle all 4.0.2+dfsg-2 [40.1 kB]
137s Get:12 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-pbr all 6.1.1-0ubuntu1 [58.2 kB]
137s Get:13 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-jschema-to-python all 1.2.3-3 [8120 B]
137s Get:14 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-mypy-extensions all 1.0.0-1 [6148 B]
137s Get:15 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-psutil ppc64el 5.9.8-2build3 [197 kB]
137s Get:16 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-mypy ppc64el 1.15.0-5 [8881 kB]
137s Get:17 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pluggy all 1.5.0-1 [21.0 kB]
137s Get:18 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pypuppetdb all 3.2.0-1 [28.9 kB]
137s Get:19 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-pytest all 8.3.5-2 [252 kB]
137s Get:20 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-sarif-python-om all 1.0.4-3 [12.5 kB]
137s Get:21 http://ftpmaster.internal/ubuntu questing/main ppc64el python3-stevedore all 1:5.4.1-0ubuntu1 [21.2 kB]
137s Get:22 http://ftpmaster.internal/ubuntu questing/universe ppc64el python3-bandit all 1.7.10-2 [75.1 kB]
137s Fetched 18.4 MB in 1s (26.8 MB/s)
137s Selecting previously unselected package liberror-perl.
137s (Reading database ... 79652 files and directories currently installed.)
137s Preparing to unpack .../00-liberror-perl_0.17030-1_all.deb ...
137s Unpacking liberror-perl (0.17030-1) ...
137s Selecting previously unselected package git-man.
137s Preparing to unpack .../01-git-man_1%3a2.48.1-0ubuntu1_all.deb ...
137s Unpacking git-man (1:2.48.1-0ubuntu1) ...
137s Selecting previously unselected package git.
137s Preparing to unpack .../02-git_1%3a2.48.1-0ubuntu1_ppc64el.deb ...
137s Unpacking git (1:2.48.1-0ubuntu1) ...
137s Selecting previously unselected package python3-all.
137s Preparing to unpack .../03-python3-all_3.13.4-1_ppc64el.deb ...
137s Unpacking python3-all (3.13.4-1) ...
137s Selecting previously unselected package python3-smmap.
137s Preparing to unpack .../04-python3-smmap_6.0.0-1_all.deb ...
137s Unpacking python3-smmap (6.0.0-1) ...
137s Selecting previously unselected package python3-gitdb.
137s Preparing to unpack .../05-python3-gitdb_4.0.12-1_all.deb ...
137s Unpacking python3-gitdb (4.0.12-1) ...
137s Selecting previously unselected package python3-git.
137s Preparing to unpack .../06-python3-git_3.1.44-1_all.deb ...
137s Unpacking python3-git (3.1.44-1) ...
137s Selecting previously unselected package python3-httpretty.
137s Preparing to unpack .../07-python3-httpretty_1.1.4-4_all.deb ...
137s Unpacking python3-httpretty (1.1.4-4) ...
137s Selecting previously unselected package python3-importlib-metadata.
137s Preparing to unpack .../08-python3-importlib-metadata_8.7.0-2_all.deb ...
137s Unpacking python3-importlib-metadata (8.7.0-2) ...
138s Selecting previously unselected package python3-iniconfig.
138s Preparing to unpack .../09-python3-iniconfig_1.1.1-2_all.deb ...
138s Unpacking python3-iniconfig (1.1.1-2) ...
138s Selecting previously unselected package python3-jsonpickle.
138s Preparing to unpack .../10-python3-jsonpickle_4.0.2+dfsg-2_all.deb ...
138s Unpacking python3-jsonpickle (4.0.2+dfsg-2) ...
138s Selecting previously unselected package python3-pbr.
138s Preparing to unpack .../11-python3-pbr_6.1.1-0ubuntu1_all.deb ...
138s Unpacking python3-pbr (6.1.1-0ubuntu1) ...
138s Selecting previously unselected package python3-jschema-to-python.
138s Preparing to unpack .../12-python3-jschema-to-python_1.2.3-3_all.deb ...
138s Unpacking python3-jschema-to-python (1.2.3-3) ...
138s Selecting previously unselected package python3-mypy-extensions.
138s Preparing to unpack .../13-python3-mypy-extensions_1.0.0-1_all.deb ...
138s Unpacking python3-mypy-extensions (1.0.0-1) ...
138s Selecting previously unselected package python3-psutil.
138s Preparing to unpack .../14-python3-psutil_5.9.8-2build3_ppc64el.deb ...
138s Unpacking python3-psutil (5.9.8-2build3) ...
138s Selecting previously unselected package python3-mypy.
138s Preparing to unpack .../15-python3-mypy_1.15.0-5_ppc64el.deb ...
138s Unpacking python3-mypy (1.15.0-5) ...
138s Selecting previously unselected package python3-pluggy.
138s Preparing to unpack .../16-python3-pluggy_1.5.0-1_all.deb ...
138s Unpacking python3-pluggy (1.5.0-1) ...
138s Selecting previously unselected package python3-pypuppetdb.
138s Preparing to unpack .../17-python3-pypuppetdb_3.2.0-1_all.deb ...
138s Unpacking python3-pypuppetdb (3.2.0-1) ...
138s Selecting previously unselected package python3-pytest.
138s Preparing to unpack .../18-python3-pytest_8.3.5-2_all.deb ...
138s Unpacking python3-pytest (8.3.5-2) ...
138s Selecting previously unselected package python3-sarif-python-om.
138s Preparing to unpack .../19-python3-sarif-python-om_1.0.4-3_all.deb ...
138s Unpacking python3-sarif-python-om (1.0.4-3) ...
138s Selecting previously unselected package python3-stevedore.
138s Preparing to unpack .../20-python3-stevedore_1%3a5.4.1-0ubuntu1_all.deb ...
138s Unpacking python3-stevedore (1:5.4.1-0ubuntu1) ...
138s Selecting previously unselected package python3-bandit.
138s Preparing to unpack .../21-python3-bandit_1.7.10-2_all.deb ...
138s Unpacking python3-bandit (1.7.10-2) ...
138s Setting up python3-iniconfig (1.1.1-2) ...
138s Setting up python3-httpretty (1.1.4-4) ...
138s Setting up python3-jsonpickle (4.0.2+dfsg-2) ...
138s Setting up python3-importlib-metadata (8.7.0-2) ...
138s Setting up python3-pbr (6.1.1-0ubuntu1) ...
139s Setting up python3-mypy-extensions (1.0.0-1) ...
139s Setting up python3-all (3.13.4-1) ...
139s Setting up python3-psutil (5.9.8-2build3) ...
139s Setting up liberror-perl (0.17030-1) ...
139s Setting up python3-sarif-python-om (1.0.4-3) ...
139s Setting up python3-jschema-to-python (1.2.3-3) ...
139s Setting up python3-mypy (1.15.0-5) ...
140s Setting up python3-pluggy (1.5.0-1) ...
140s Setting up python3-stevedore (1:5.4.1-0ubuntu1) ...
140s Setting up git-man (1:2.48.1-0ubuntu1) ...
140s Setting up python3-pypuppetdb (3.2.0-1) ...
141s Setting up python3-smmap (6.0.0-1) ...
141s Setting up python3-pytest (8.3.5-2) ...
141s Setting up python3-gitdb (4.0.12-1) ...
141s Setting up git (1:2.48.1-0ubuntu1) ...
141s Setting up python3-git (3.1.44-1) ...
141s Setting up python3-bandit (1.7.10-2) ...
141s Processing triggers for man-db (2.13.1-1) ...
144s autopkgtest [14:45:55]: test unittests: [-----------------------
144s **************************************************************************
144s # A new feature in cloud-init identified possible datasources for        #
144s # this system as:                                                        #
144s #   []                                                                   #
144s # However, the datasource used was: OpenStack                            #
144s #                                                                        #
144s # In the future, cloud-init will only attempt to use datasources that    #
144s # are identified or specifically configured.                             #
144s # For more information see                                               #
144s #   https://bugs.launchpad.net/bugs/1669675                              #
144s #                                                                        #
144s # If you are seeing this message, please file a bug against              #
144s # cloud-init at                                                          #
144s #   https://github.com/canonical/cloud-init/issues                       #
144s # Make sure to include the cloud provider your instance is               #
144s # running on.                                                            #
144s #                                                                        #
144s # After you have filed a bug, you can disable this warning by launching  #
144s # your instance with the cloud-config below, or putting that content     #
144s # into /etc/cloud/cloud.cfg.d/99-warnings.cfg                            #
144s #                                                                        #
144s # #cloud-config                                                          #
144s # warnings:                                                              #
144s #   dsid_missing_source: off                                             #
144s **************************************************************************
144s
144s Disable the warnings above by:
144s   touch /home/ubuntu/.cloud-warnings.skip
144s or
144s   touch /var/lib/cloud/instance/warnings/.skip
144s === python3.13 ===
145s ============================= test session starts ==============================
145s platform linux -- Python 3.13.5, pytest-8.3.5, pluggy-1.5.0
145s rootdir: /tmp/autopkgtest.Ywp3nR/autopkgtest_tmp
145s plugins: typeguard-4.4.2
145s collected 163 items
145s
145s tests/test_api_base_init.py ............................ [ 17%]
148s tests/test_api_base_query.py ...FFFFFFFFFFFFFFFFFF.FF [ 31%]
148s tests/test_api_base_url.py .... [ 34%]
149s tests/test_api_metrics.py FFFFFFFFF. [ 40%]
149s tests/test_api_other.py FFF.FF [ 44%]
149s tests/test_api_pql.py FF..... [ 48%]
150s tests/test_api_query.py FF.FFF [ 52%]
150s tests/test_connect.py .. [ 53%]
150s tests/test_querybuilder.py ...................................... [ 76%]
150s tests/test_types.py .............................. [ 95%]
150s tests/test_utils.py ........
[100%] 150s 150s =================================== FAILURES =================================== 150s _____________ TestBaseAPIQuery.test_setting_headers_without_token ______________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes', body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_setting_headers_without_token(self, api): 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/query/v4/nodes") 150s > api._query("nodes") # need to query some endpoint 150s 150s tests/test_api_base_query.py:59: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 
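
The `Retry(total=0, connect=None, read=False, redirect=None, status=None)` object in these frames is what requests passes as its default `max_retries`: a single connection failure exhausts the budget and `increment()` raises `MaxRetryError`. A minimal standalone sketch of that behaviour (not taken from the test suite):

```python
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

# requests uses Retry(total=0) by default, so the very first error
# already exhausts the retry budget.
retry = Retry(total=0)
print(retry.is_exhausted())  # False: "exhausted" means a counter went below 0

try:
    # increment() builds a new Retry with total=-1, sees it is exhausted,
    # and raises MaxRetryError wrapping the original error as `reason` --
    # the same path that produced the MaxRetryError above.
    retry.increment(method="GET", url="/pdb/query/v4/nodes",
                    error=OSError("connection failed"))
except MaxRetryError as exc:
    print(type(exc.reason).__name__)
```

This is why the ProxyError surfaces immediately in the test run: with `total=0` there is no second attempt against the proxy.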
/usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(221 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes via GET at 1750344355.2034147 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 150s _______________ TestBaseAPIQuery.test_setting_headers_with_token _______________ 150s 150s self = 150s method = 'GET', url = '/pdb/query/v4/nodes', body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'X-Authentication': 'tokenstring'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/pdb/query/v4/nodes', query=None, fragment=None) 150s destination_scheme = None, conn = None, release_this_conn = True 150s http_tunnel_required = True, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s 
pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. 
Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. 
Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 
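
The differing `parsed_url` values in the two failing tests follow directly from `parse_url` at the top of `urlopen`: a pool that tunnels through a proxy is handed only the path, so no scheme or host is parsed. A quick illustration (`parse_url` is urllib3's public URL parser, unrelated to pypuppetdb's own code):

```python
from urllib3.util import parse_url

# A bare path -- what a tunnelled proxy pool receives -- parses with no
# scheme or host, matching Url(scheme=None, ..., path='/pdb/query/v4/nodes').
print(parse_url("/pdb/query/v4/nodes"))

# A full URL (the direct, non-tunnelled case) carries scheme, host and port,
# matching the parsed_url shown in the test_with_path traceback.
u = parse_url("http://localhost:8080/pdb/query/v4/nodes/node1")
print(u.scheme, u.host, u.port)
```

The missing scheme is also why `http_tunnel_required` is True in the proxied tests but False in `test_with_path`.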
150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s > self._prepare_proxy(conn) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:773: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:1042: in _prepare_proxy 150s conn.connect() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:721: in connect 150s self._tunnel() 150s /usr/lib/python3.13/http/client.py:971: in _tunnel 150s (version, code, message) = response._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 
150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:474: in increment 150s raise reraise(type(error), error, _stacktrace) 150s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 150s raise value.with_traceback(tb) 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:773: in urlopen 150s self._prepare_proxy(conn) 150s 
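
The `timeout = Timeout(connect=10, read=10, total=None)` repeated through these frames is the urllib3 object that the tuple-unpacking branch above builds (`TimeoutSauce` in requests is just an import alias for urllib3's `Timeout`). A small sketch of the same normalization; the 10-second values mirror the session settings visible in the trace:

```python
from urllib3.util import Timeout

# requests' adapter turns timeout=(connect, read) into this object; passing
# a single float sets both values, which is how
# Timeout(connect=10, read=10, total=None) appears in the traceback.
t = Timeout(connect=10, read=10)
print(t.connect_timeout, t.read_timeout, t.total)
```

Since `total` stays None, each phase gets its own independent 10-second budget rather than sharing one overall deadline.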
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:1042: in _prepare_proxy 150s conn.connect() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:721: in connect 150s self._tunnel() 150s /usr/lib/python3.13/http/client.py:971: in _tunnel 150s (version, code, message) = response._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) 150s 150s /usr/lib/python3.13/http/client.py:300: ProtocolError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s token_api = 150s 150s def test_setting_headers_with_token(self, token_api): 150s httpretty.enable() 150s stub_request("https://localhost:8080/pdb/query/v4/nodes") 150s > token_api._query("nodes") # need to query some endpoint 150s 150s tests/test_api_base_query.py:73: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s 
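
All of these failures share one root cause, visible in the captured warnings: the testbed exports `http_proxy`/`https_proxy` pointing at `egress.ps7.internal:3128`, so requests opens a real CONNECT tunnel to the proxy and bypasses the httpretty stub registered for `localhost:8080`. A common fix (a sketch only; the actual pypuppetdb fixtures may choose a different mechanism) is to stop the session from honouring proxy environment variables during tests:

```python
import os
import requests

# Sketch: neutralise environment proxies so a socket-level stub such as
# httpretty can intercept requests to localhost instead of a real proxy.
session = requests.Session()
session.trust_env = False  # ignore http_proxy/https_proxy/no_proxy from env

# Alternative: extend no_proxy so localhost traffic never goes via the proxy.
os.environ["no_proxy"] = os.environ.get("no_proxy", "") + ",localhost"
```

Either approach keeps the stubbed URL from ever reaching the real proxy, so `RemoteDisconnected` during `_prepare_proxy` cannot occur.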
/usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s > raise ConnectionError(err, request=request) 150s E requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:682: ConnectionError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(57 bytes) to 
://localhost:8080localhost:8080 via CONNECT at 1750344355.3507786 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTPS. 150s _______________________ TestBaseAPIQuery.test_with_path ________________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes/node1' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes/node1', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. 
note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. 
When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes/node1' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s             if redirect is not None:
150s                 redirect -= 1
150s                 cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E   urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes/node1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_with_path(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes/node1")
150s >       api._query("nodes", path="node1")
150s 
150s tests/test_api_base_query.py:92: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return
self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4/nodes/node1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(227 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes/node1 via GET at 1750344355.497806
150s ERROR    pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s _____________________ TestBaseAPIQuery.test_with_url_path ______________________
150s 
150s self = 
150s method = 'GET', url = 'http://localhost:8080/puppetdb/pdb/query/v4/nodes'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/puppetdb/pdb/query/v4/nodes', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT =
_DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. 
Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. 
Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 
150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 
150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 
150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/puppetdb/pdb/query/v4/nodes' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 
150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or 
ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E   urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/puppetdb/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_with_url_path(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/puppetdb/pdb/query/v4/nodes")
150s         api.url_path = "/puppetdb"
150s >       api._query("nodes")
150s 
150s tests/test_api_base_query.py:101: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends
PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s 
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s 
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s 
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/puppetdb/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(230 bytes) to ://localhost:8080http://localhost:8080/puppetdb/pdb/query/v4/nodes via GET at 1750344355.616557
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
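The captured log above shows the root cause shared by these failures: the testbed exports `http_proxy`/`https_proxy`, so requests routes the stubbed `localhost:8080` URL to `egress.ps7.internal:3128` and httpretty never intercepts it. A minimal sketch (not part of the log; the proxy host below is hypothetical) of how an environment proxy leaks into a `requests.Session`, and how `trust_env = False` keeps the request local:

```python
import os
import requests

# Start from a clean slate so only our hypothetical proxy variable is visible.
for var in ("http_proxy", "https_proxy", "no_proxy",
            "HTTP_PROXY", "HTTPS_PROXY", "NO_PROXY",
            "all_proxy", "ALL_PROXY"):
    os.environ.pop(var, None)
os.environ["http_proxy"] = "http://egress.example.invalid:3128"  # hypothetical

url = "http://localhost:8080/pdb/query/v4/nodes"
session = requests.Session()

# With trust_env on (the default), the environment proxy is merged into the
# request's settings, which is why a stubbed localhost URL still ends up at
# the proxy host instead of at the httpretty stub.
with_env = session.merge_environment_settings(url, {}, None, None, None)

# Disabling trust_env makes the session ignore proxy variables entirely.
session.trust_env = False
without_env = session.merge_environment_settings(url, {}, None, None, None)

print(with_env["proxies"])     # the environment proxy leaks in
print(without_env["proxies"])  # {} -> the connection goes straight to localhost
```

Exporting `no_proxy=localhost` on the worker would have the same effect for these tests without changing the session.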
150s ______________ TestBaseAPIQuery.test_with_password_authorization _______________
150s 
150s self = 
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes', body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'Authorization': 'Basic cHVwcGV0ZGI6cGFzc3dvcmQxMjM='}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 
150s         .. note::
150s 
150s            More commonly, it's appropriate to use a convenience method
150s            such as :meth:`request`.
150s 
150s         .. note::
150s 
150s            `release_conn` will only behave as expected if
150s            `preload_content=False` because we want to make
150s            `preload_content=False` the default behaviour someday soon without
150s            breaking backwards compatibility.
150s 
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s 
150s         :param url:
150s             The URL to perform the request on.
150s 
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s 
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s 
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s 
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s 
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s 
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s 
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s 
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s 
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s 
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s 
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s 
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s 
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s 
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s 
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s 
150s         if headers is None:
150s             headers = self.headers
150s 
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s 
150s         if release_conn is None:
150s             release_conn = preload_content
150s 
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s 
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s 
150s         conn = None
150s 
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1] 
150s         release_this_conn = release_conn
150s 
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s 
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s 
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s 
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s 
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s 
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s 
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s 
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s 
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s 
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s 
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s 
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool = 
150s _stacktrace = 
150s 
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s 
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s 
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s 
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s 
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s 
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s 
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s 
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s 
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_with_password_authorization(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s         api.session.auth = ("puppetdb", "password123")
150s >       api._query("nodes")
150s 
150s tests/test_api_base_query.py:110: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s 
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s 
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s 
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(272 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes via GET at 1750344355.7432609
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ________________ TestBaseAPIQuery.test_with_token_authorization ________________
150s 
150s self = 
150s method = 'GET', url = '/pdb/query/v4/nodes', body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'X-Authentication': 'tokenstring'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/pdb/query/v4/nodes', query=None, fragment=None)
150s destination_scheme = None, conn = None, release_this_conn = True
150s http_tunnel_required = True, err = None, clean_exit = False
150s 
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 
150s         .. note::
150s 
150s            More commonly, it's appropriate to use a convenience method
150s            such as :meth:`request`.
150s 
150s         .. note::
150s 
150s            `release_conn` will only behave as expected if
150s            `preload_content=False` because we want to make
150s            `preload_content=False` the default behaviour someday soon without
150s            breaking backwards compatibility.
150s 
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s 
150s         :param url:
150s             The URL to perform the request on.
150s 
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s 
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s 
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s 
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s 
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s 
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s 
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s 
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s 
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s 
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s 
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s 
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s 
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s 
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s 
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s 
150s         if headers is None:
150s             headers = self.headers
150s 
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s 
150s         if release_conn is None:
150s             release_conn = preload_content
150s 
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s 
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s 
150s         conn = None
150s 
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1] 
150s         release_this_conn = release_conn
150s 
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s 
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s > self._prepare_proxy(conn) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:773: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:1042: in _prepare_proxy 150s conn.connect() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:721: in connect 150s self._tunnel() 150s /usr/lib/python3.13/http/client.py:971: in _tunnel 150s (version, code, message) = response._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                 " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:474: in increment 150s raise reraise(type(error), error, _stacktrace) 150s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 150s raise value.with_traceback(tb) 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:773: in urlopen 150s self._prepare_proxy(conn) 150s 
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:1042: in _prepare_proxy
150s     conn.connect()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:721: in connect
150s     self._tunnel()
150s /usr/lib/python3.13/http/client.py:971: in _tunnel
150s     (version, code, message) = response._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s 
150s self = 
150s 
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                 " response")
150s E           urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s /usr/lib/python3.13/http/client.py:300: ProtocolError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s token_api = 
150s 
150s     def test_with_token_authorization(self, token_api):
150s         httpretty.enable()
150s         stub_request("https://localhost:8080/pdb/query/v4/nodes")
150s >       token_api._query("nodes")
150s 
150s tests/test_api_base_query.py:124: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s > raise ConnectionError(err, request=request) 150s E requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:682: ConnectionError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(57 bytes) to 
://localhost:8080localhost:8080 via CONNECT at 1750344355.857601
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTPS.
150s _______________________ TestBaseAPIQuery.test_with_query _______________________
150s 
150s self = 
150s method = 'GET'
150s url = 'http://localhost:8080/pdb/query/v4/nodes?query=%5B%22certname%22%2C+%22%3D%22%2C+%22node1%22%5D'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='query=%5B%22certname%22%2C+%22%3D%22%2C+%22node1%22%5D', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen( # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 
150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s 
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s 
150s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s 
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s 
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s 
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s 
150s self = 
150s 
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                 " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s 
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s         ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s         " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s The above exception was the direct
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET' 150s url = 'http://localhost:8080/pdb/query/v4/nodes?query=%5B%22certname%22%2C+%22%3D%22%2C+%22node1%22%5D' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?query=%5B%22certname%22%2C+%22%3D%22%2C+%22node1%22%5D (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_with_query(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s >       api._query("nodes", query='["certname", "=", "node1"]')
150s 
150s tests/test_api_base_query.py:131: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s 
/usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s 
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s 
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s 
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?query=%5B%22certname%22%2C+%22%3D%22%2C+%22node1%22%5D (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(276 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?query=%5B%22certname%22%2C+%22%3D%22%2C+%22node1%22%5D via GET at 1750344356.015201
150s ERROR    pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s _______________________ TestBaseAPIQuery.test_with_order _______________________
150s 
150s self = 
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?order_by=ted'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='order_by=ted', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 
150s         .. note::
150s 
150s            More commonly, it's appropriate to use a convenience method
150s            such as :meth:`request`.
150s 
150s         .. note::
150s 
150s            `release_conn` will only behave as expected if
150s            `preload_content=False` because we want to make
150s            `preload_content=False` the default behaviour someday soon without
150s            breaking backwards compatibility.
150s 
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s 
150s         :param url:
150s             The URL to perform the request on.
150s 
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s 
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s 
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s 
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s 
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s 
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s 
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s 
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s 
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s 
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s 
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s 
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s 
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s 
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s 
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s 
150s         if headers is None:
150s             headers = self.headers
150s 
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s 
150s         if release_conn is None:
150s             release_conn = preload_content
150s 
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s 
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s 
150s         conn = None
150s 
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1] 
150s         release_this_conn = release_conn
150s 
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s 
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s 
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s 
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s 
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s 
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s 
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s 
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s 
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s 
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s 
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s 
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?order_by=ted'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool = 
150s _stacktrace = 
150s 
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s 
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s 
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s 
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s 
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s 
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s 
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s 
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s 
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?order_by=ted (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_with_order(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s >       api._query("nodes", order_by="ted")
150s 
150s tests/test_api_base_query.py:141: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s 
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s 
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s 
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?order_by=ted (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(234 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?order_by=ted via GET at 1750344356.1283627
150s ERROR    pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
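The failures above share one root cause, visible in the captured log: the testbed exports `http_proxy`/`https_proxy` pointing at `egress.ps7.internal:3128`, and requests honours those environment variables by default (`Session.trust_env` is `True`), so the `http://localhost:8080` request that httpretty stubbed is instead sent to the real proxy, which closes the connection. A minimal sketch of the mechanism and two common test-side mitigations follows; this is a hypothetical illustration, not code from pypuppetdb or its test suite:

```python
# Sketch (assumption-laden, not from the log): why a socket-level mock for
# http://localhost:8080 is bypassed when proxy environment variables are set.
import os
import requests

# Simulate the testbed environment seen in the captured log.
os.environ["http_proxy"] = "http://egress.ps7.internal:3128/"
os.environ.pop("no_proxy", None)

url = "http://localhost:8080/pdb/query/v4/nodes"

# With no no_proxy entry, requests resolves a proxy even for localhost,
# so the request goes to the real proxy instead of the mocked socket.
# Mitigation 1: an explicit no_proxy entry makes localhost bypass the proxy.
assert requests.utils.get_environ_proxies(url, no_proxy="localhost") == {}

# Mitigation 2: ignore proxy environment variables for the whole session.
session = requests.Session()
session.trust_env = False
settings = session.merge_environment_settings(url, {}, None, None, None)
assert settings["proxies"] == {}  # no proxy applied; the mock can intercept
```

Either mitigation would let httpretty's stub for `localhost:8080` answer the request; which one is appropriate depends on whether the test suite should honour proxies for non-local hosts.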
150s _______________________ TestBaseAPIQuery.test_with_limit _______________________
150s 
150s self = 
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?limit=1'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='limit=1', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 
150s         .. note::
150s 
150s            More commonly, it's appropriate to use a convenience method
150s            such as :meth:`request`.
150s 
150s         .. note::
150s 
150s            `release_conn` will only behave as expected if
150s            `preload_content=False` because we want to make
150s            `preload_content=False` the default behaviour someday soon without
150s            breaking backwards compatibility.
150s 
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s 
150s         :param url:
150s             The URL to perform the request on.
150s 
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s 
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s 
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s 
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s 
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s 
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s 
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s 
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s 
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s 
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s 
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s 
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s 
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s 
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s 
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s 
150s         if headers is None:
150s             headers = self.headers
150s 
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s 
150s         if release_conn is None:
150s             release_conn = preload_content
150s 
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s 
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s 
150s         conn = None
150s 
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1] 
150s         release_this_conn = release_conn
150s 
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s 
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s 
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s 
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s 
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s 
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s 
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s 
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s 
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s 
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s 
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s 
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}.
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?limit=1' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?limit=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s     def test_with_limit(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s >       api._query("nodes", limit=1)
150s
150s tests/test_api_base_query.py:149:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?limit=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(229 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?limit=1 via GET at 1750344356.2438304
150s ERROR    pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ___________________ TestBaseAPIQuery.test_with_include_total ___________________
150s
150s self =
150s method = 'GET'
150s url = 'http://localhost:8080/pdb/query/v4/nodes?include_total=true', body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='include_total=true', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s
150s         .. note::
150s
150s             More commonly, it's appropriate to use a convenience method
150s             such as :meth:`request`.
150s
150s         .. note::
150s
150s             `release_conn` will only behave as expected if
150s             `preload_content=False` because we want to make
150s             `preload_content=False` the default behaviour someday soon without
150s             breaking backwards compatibility.
150s
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s
150s         :param url:
150s             The URL to perform the request on.
150s
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s
150s         if headers is None:
150s             headers = self.headers
150s
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s         if release_conn is None:
150s             release_conn = preload_content
150s
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s
150s         conn = None
150s
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1]
150s         release_this_conn = release_conn
150s
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET'
150s url = 'http://localhost:8080/pdb/query/v4/nodes?include_total=true'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?include_total=true (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s     def test_with_include_total(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s >       api._query("nodes", include_total=True)
150s
150s tests/test_api_base_query.py:157:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?include_total=true (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(240 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?include_total=true via GET at 1750344356.3564203 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
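The captured warnings above show the root cause: httpretty stubbed `http://localhost:8080`, but the testbed's `http_proxy` environment variable made requests route the call to the real egress proxy (`egress.ps7.internal:3128`), which closed the connection without a response. A minimal sketch of that proxy-resolution behaviour, using the real `requests.utils.get_environ_proxies` helper (the proxy host below is hypothetical, standing in for the egress proxy in this log):

```python
import os
from requests.utils import get_environ_proxies

# Start from a clean slate so pre-existing NO_PROXY settings don't mask the effect.
os.environ.pop("no_proxy", None)
os.environ.pop("NO_PROXY", None)
# Hypothetical proxy host, mirroring the egress.ps7.internal:3128 setup in the log.
os.environ["http_proxy"] = "http://egress.example.test:3128/"

url = "http://localhost:8080/pdb/query/v4/nodes"

# On Linux, nothing bypasses the proxy by default -- even localhost URLs
# resolve to the configured proxy, which is why the stubbed request above
# escaped httpretty and hit a real socket.
print(get_environ_proxies(url))

# Listing the host in no_proxy makes requests connect directly again,
# so a socket-level stub can intercept the request.
print(get_environ_proxies(url, no_proxy="localhost"))  # {}
```

Two common mitigations in test suites are setting `session.trust_env = False` on the `requests.Session` (so environment proxies are ignored entirely) or exporting `no_proxy=localhost` in the test environment.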
150s ______________________ TestBaseAPIQuery.test_with_offset _______________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?offset=1' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='offset=1', fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?offset=1' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?offset=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_with_offset(self, api): 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/query/v4/nodes") 150s > api._query("nodes", offset=1) 150s 150s tests/test_api_base_query.py:165: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return 
self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4/nodes?offset=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(230 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?offset=1 via GET at 1750344356.4710608 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 150s ___________________ TestBaseAPIQuery.test_with_summarize_by ____________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?summarize_by=1' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='summarize_by=1', fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: 
_TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. 
Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. 
Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 
150s         if not http_tunnel_required:
150s             headers = headers.copy() # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s
150s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?summarize_by=1'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?summarize_by=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s     def test_with_summarize_by(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s >       api._query("nodes", summarize_by=1)
150s
150s tests/test_api_base_query.py:173:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?summarize_by=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(236 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?summarize_by=1 via GET at 1750344356.5864673
150s ERROR    pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
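Both failures in this log share the root cause visible in the captured log above: the testbed exports `HTTP_PROXY`/`HTTPS_PROXY` pointing at `egress.ps7.internal:3128`, and requests (with the default `Session.trust_env = True`) routes even the httpretty-stubbed `http://localhost:8080` URL through that proxy, which then closes the connection. The sketch below, which is illustrative and not pypuppetdb code (the proxy value is copied from the log), shows how a `Session` resolves environment proxies and how opting out keeps stubbed localhost traffic local:

```python
import os

import requests

# Simulate the testbed environment seen in the log (value copied from the
# captured `proxies` variable above; NO_PROXY is cleared so localhost is
# not exempted, matching the failing runs).
os.environ["HTTP_PROXY"] = "http://egress.ps7.internal:3128/"
os.environ.pop("NO_PROXY", None)
os.environ.pop("no_proxy", None)

url = "http://localhost:8080/pdb/query/v4/nodes"

session = requests.Session()
# merge_environment_settings() is what Session.request() uses to decide
# which proxies apply; with trust_env=True (the default) the environment
# proxy wins, so even a localhost URL is sent to the egress proxy.
settings = session.merge_environment_settings(url, {}, None, None, None)
print(settings["proxies"].get("http"))  # the egress proxy from the environment

# Disabling trust_env ignores environment proxy variables entirely, so the
# request would go straight to localhost where httpretty can intercept it.
session.trust_env = False
settings = session.merge_environment_settings(url, {}, None, None, None)
print(settings["proxies"])  # {}
```

Setting `trust_env = False` on the test session, or exporting `NO_PROXY=localhost` in the test environment, are the usual ways to shield httpretty-based test suites from a CI egress proxy like this one.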
150s _____________________ TestBaseAPIQuery.test_with_count_by ______________________
150s
150s self =
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?count_by=1'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='count_by=1', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s     def urlopen( # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s
150s         .. note::
150s
150s             More commonly, it's appropriate to use a convenience method
150s             such as :meth:`request`.
150s
150s         .. note::
150s
150s             `release_conn` will only behave as expected if
150s             `preload_content=False` because we want to make
150s             `preload_content=False` the default behaviour someday soon without
150s             breaking backwards compatibility.
150s
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s
150s         :param url:
150s             The URL to perform the request on.
150s
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s
150s         if headers is None:
150s             headers = self.headers
150s
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s         if release_conn is None:
150s             release_conn = preload_content
150s
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s
150s         conn = None
150s
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1]
150s         release_this_conn = release_conn
150s
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy() # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s
150s             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?count_by=1'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?count_by=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s     def test_with_count_by(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s >       api._query("nodes", count_by=1)
150s
150s tests/test_api_base_query.py:181:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4/nodes?count_by=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(232 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?count_by=1 via GET at 1750344356.6985927 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 150s ___________________ TestBaseAPIQuery.test_with_count_filter ____________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?counts_filter=1' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='counts_filter=1', fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s 
timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. 
Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. 
Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 
150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 
150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 
150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes?counts_filter=1' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 
150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or 
ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?counts_filter=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_with_count_filter(self, api): 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/query/v4/nodes") 150s > api._query("nodes", count_filter=1) 150s 150s tests/test_api_base_query.py:189: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. 
Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?counts_filter=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(237 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?counts_filter=1 via GET at 1750344356.8114595 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
150s ____________________ TestBaseAPIQuery.test_with_payload_get ____________________
150s
150s self =
150s method = 'GET'
150s url = 'http://localhost:8080/pdb/query/v4/nodes?foo=bar&count_by=1', body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='foo=bar&count_by=1', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s
150s         .. note::
150s
150s             More commonly, it's appropriate to use a convenience method
150s             such as :meth:`request`.
150s
150s         .. note::
150s
150s             `release_conn` will only behave as expected if
150s             `preload_content=False` because we want to make
150s             `preload_content=False` the default behaviour someday soon without
150s             breaking backwards compatibility.
150s
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s
150s         :param url:
150s             The URL to perform the request on.
150s
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s
150s         if headers is None:
150s             headers = self.headers
150s
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s         if release_conn is None:
150s             release_conn = preload_content
150s
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s
150s         conn = None
150s
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1]
150s         release_this_conn = release_conn
150s
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET'
150s url = 'http://localhost:8080/pdb/query/v4/nodes?foo=bar&count_by=1'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?foo=bar&count_by=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s     def test_with_payload_get(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/nodes")
150s >       api._query("nodes", payload={"foo": "bar"}, count_by=1)
150s
150s tests/test_api_base_query.py:197:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?foo=bar&count_by=1 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(240 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?foo=bar&count_by=1 via GET at 1750344356.927442
150s ERROR    pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ___________________ TestBaseAPIQuery.test_with_payload_post ____________________
150s
150s self =
150s method = 'POST', url = 'http://localhost:8080/pdb/query/v4/nodes'
150s body = '{"foo": "bar", "count_by": 1}'
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'Content-Length': '29'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s
150s         .. note::
150s
150s             More commonly, it's appropriate to use a convenience method
150s             such as :meth:`request`.
150s
150s         .. note::
150s
150s             `release_conn` will only behave as expected if
150s             `preload_content=False` because we want to make
150s             `preload_content=False` the default behaviour someday soon without
150s             breaking backwards compatibility.
150s
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s
150s         :param url:
150s             The URL to perform the request on.
150s
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s
150s         if headers is None:
150s             headers = self.headers
150s
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s         if release_conn is None:
150s             release_conn = preload_content
150s
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s
150s         conn = None
150s
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1]
150s         release_this_conn = release_conn
150s
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'POST', url = 'http://localhost:8080/pdb/query/v4/nodes'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s
150s         if new_retry.is_exhausted():
150s             reason = error or
ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_with_payload_post(self, api): 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/query/v4/nodes", method=httpretty.POST) 150s > api._query("nodes", payload={"foo": "bar"}, count_by=1, request_method="POST") 150s 150s tests/test_api_base_query.py:208: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:418: in _make_request 150s r = self.session.post( 150s /usr/lib/python3/dist-packages/requests/sessions.py:637: in post 150s return self.request("POST", url, data=data, json=json, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, 
cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(242 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes via POST at 1750344357.0397334 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
150s _____________________ TestBaseAPIQuery.test_response_empty _____________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes', body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/nodes' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_response_empty(self, api): 150s httpretty.enable() 150s httpretty.register_uri( 150s httpretty.GET, 150s "http://localhost:8080/pdb/query/v4/nodes", 150s body=json.dumps(None), 150s ) 150s with pytest.raises(pypuppetdb.errors.EmptyResponseError): 150s > api._query("nodes") 150s 150s tests/test_api_base_query.py:223: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(221 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes via GET at 1750344357.152794
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ___________________ TestBaseAPIQuery.test_response_x_records ___________________
150s
150s self =
150s method = 'GET'
150s url = 'http://localhost:8080/pdb/query/v4/nodes?include_total=true', body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='include_total=true', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s def urlopen( # type: ignore[override]
150s self,
150s method: str,
150s url: str,
150s body: _TYPE_BODY | None = None,
150s headers: typing.Mapping[str, str] | None = None,
150s retries: Retry | bool | int | None = None,
150s redirect: bool = True,
150s assert_same_host: bool = True,
150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s pool_timeout: int | None = None,
150s release_conn: bool | None = None,
150s chunked: bool = False,
150s body_pos: _TYPE_BODY_POSITION | None = None,
150s preload_content: bool = True,
150s decode_content: bool = True,
150s **response_kw: typing.Any,
150s ) -> BaseHTTPResponse:
150s """
150s Get a connection from the pool and perform an HTTP request. This is the
150s lowest level call for making a request, so you'll need to specify all
150s the raw details.
150s
150s .. note::
150s
150s More commonly, it's appropriate to use a convenience method
150s such as :meth:`request`.
150s
150s .. note::
150s
150s `release_conn` will only behave as expected if
150s `preload_content=False` because we want to make
150s `preload_content=False` the default behaviour someday soon without
150s breaking backwards compatibility.
150s
150s :param method:
150s HTTP request method (such as GET, POST, PUT, etc.)
150s
150s :param url:
150s The URL to perform the request on.
150s
150s :param body:
150s Data to send in the request body, either :class:`str`, :class:`bytes`,
150s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s :param headers:
150s Dictionary of custom headers to send, such as User-Agent,
150s If-None-Match, etc. If None, pool headers are used. If provided,
150s these headers completely replace any pool-specific headers.
150s
150s :param retries:
150s Configure the number of retries to allow before raising a
150s :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s over different types of retries.
150s Pass an integer number to retry connection errors that many times,
150s but no other types of errors. Pass zero to never retry.
150s
150s If ``False``, then retries are disabled and any exception is raised
150s immediately. Also, instead of raising a MaxRetryError on redirects,
150s the redirect response will be returned.
150s
150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s :param redirect:
150s If True, automatically handle redirects (status codes 301, 302,
150s 303, 307, 308). Each redirect counts as a retry. Disabling retries
150s will disable redirect, too.
150s
150s :param assert_same_host:
150s If ``True``, will make sure that the host of the pool requests is
150s consistent else will raise HostChangedError. When ``False``, you can
150s use the pool on an HTTP proxy and request foreign hosts.
150s
150s :param timeout:
150s If specified, overrides the default timeout for this one
150s request. It may be a float (in seconds) or an instance of
150s :class:`urllib3.util.Timeout`.
150s
150s :param pool_timeout:
150s If set and the pool is set to block=True, then this method will
150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s connection is available within the time period.
150s
150s :param bool preload_content:
150s If True, the response's body will be preloaded into memory.
150s
150s :param bool decode_content:
150s If True, will attempt to decode the body based on the
150s 'content-encoding' header.
150s
150s :param release_conn:
150s If False, then the urlopen call will not release the connection
150s back into the pool once a response is received (but will release if
150s you read the entire contents of the response such as when
150s `preload_content=True`). This is useful if you're not preloading
150s the response's content immediately. You will need to call
150s ``r.release_conn()`` on the response ``r`` to return the connection
150s back into the pool. If None, it takes the value of ``preload_content``
150s which defaults to ``True``.
150s
150s :param bool chunked:
150s If True, urllib3 will send the body using chunked transfer
150s encoding. Otherwise, urllib3 will send the body using the standard
150s content-length form. Defaults to False.
150s
150s :param int body_pos:
150s Position to seek to in file-like body in the event of a retry or
150s redirect. Typically this won't need to be set because urllib3 will
150s auto-populate the value when needed.
150s """
150s parsed_url = parse_url(url)
150s destination_scheme = parsed_url.scheme
150s
150s if headers is None:
150s headers = self.headers
150s
150s if not isinstance(retries, Retry):
150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s if release_conn is None:
150s release_conn = preload_content
150s
150s # Check host
150s if assert_same_host and not self.is_same_host(url):
150s raise HostChangedError(self, url, retries)
150s
150s # Ensure that the URL we're connecting to is properly encoded
150s if url.startswith("/"):
150s url = to_str(_encode_target(url))
150s else:
150s url = to_str(parsed_url.url)
150s
150s conn = None
150s
150s # Track whether `conn` needs to be released before
150s # returning/raising/recursing. Update this variable if necessary, and
150s # leave `release_conn` constant throughout the function. That way, if
150s # the function recurses, the original value of `release_conn` will be
150s # passed down into the recursive call, and its value will be respected.
150s #
150s # See issue #651 [1] for details.
150s #
150s # [1]
150s release_this_conn = release_conn
150s
150s http_tunnel_required = connection_requires_http_tunnel(
150s self.proxy, self.proxy_config, destination_scheme
150s )
150s
150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s # have to copy the headers dict so we can safely change it without those
150s # changes being reflected in anyone else's copy.
150s if not http_tunnel_required:
150s headers = headers.copy() # type: ignore[attr-defined]
150s headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s # Must keep the exception bound to a separate variable or else Python 3
150s # complains about UnboundLocalError.
150s err = None
150s
150s # Keep track of whether we cleanly exited the except block. This
150s # ensures we do proper cleanup in finally.
150s clean_exit = False
150s
150s # Rewind body position, if needed. Record current position
150s # for future rewinds in the event of a redirect/retry.
150s body_pos = set_file_position(body, body_pos)
150s
150s try:
150s # Request a connection from the queue.
150s timeout_obj = self._get_timeout(timeout)
150s conn = self._get_conn(timeout=pool_timeout)
150s
150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s # Is this a closed/new connection that requires CONNECT tunnelling?
150s if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s try:
150s self._prepare_proxy(conn)
150s except (BaseSSLError, OSError, SocketTimeout) as e:
150s self._raise_timeout(
150s err=e, url=self.proxy.url, timeout_value=conn.timeout
150s )
150s raise
150s
150s # If we're going to release the connection in ``finally:``, then
150s # the response doesn't need to know about the connection. Otherwise
150s # it will also try to release it and we'll have a double-release
150s # mess.
150s response_conn = conn if not release_conn else None
150s
150s # Make the request on the HTTPConnection object
150s > response = self._make_request(
150s conn,
150s method,
150s url,
150s timeout=timeout_obj,
150s body=body,
150s headers=headers,
150s chunked=chunked,
150s retries=retries,
150s response_conn=response_conn,
150s preload_content=preload_content,
150s decode_content=decode_content,
150s **response_kw,
150s )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s def _read_status(self):
150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s if len(line) > _MAXLINE:
150s raise LineTooLong("status line")
150s if self.debuglevel > 0:
150s print("reply:", repr(line))
150s if not line:
150s # Presumably, the server closed the connection before
150s # sending a valid response.
150s > raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s E http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s response = self._make_request(
150s conn,
150s ...<10 lines>...
150s **response_kw,
150s )
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s response = conn.getresponse()
150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s httplib_response = super().getresponse()
150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s response.begin()
150s ~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s version, status, reason = self._read_status()
150s ~~~~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s > resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET'
150s url = 'http://localhost:8080/pdb/query/v4/nodes?include_total=true'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s def increment(
150s self,
150s method: str | None = None,
150s url: str | None = None,
150s response: BaseHTTPResponse | None = None,
150s error: Exception | None = None,
150s _pool: ConnectionPool | None = None,
150s _stacktrace: TracebackType | None = None,
150s ) -> Self:
150s """Return a new Retry object with incremented retry counters.
150s
150s :param response: A response object, or None, if the server did not
150s return a response.
150s :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s :param Exception error: An error encountered during the request, or
150s None if the response was received successfully.
150s
150s :return: A new ``Retry`` object.
150s """
150s if self.total is False and error:
150s # Disabled, indicate to re-raise the error.
150s raise reraise(type(error), error, _stacktrace)
150s
150s total = self.total
150s if total is not None:
150s total -= 1
150s
150s connect = self.connect
150s read = self.read
150s redirect = self.redirect
150s status_count = self.status
150s other = self.other
150s cause = "unknown"
150s status = None
150s redirect_location = None
150s
150s if error and self._is_connection_error(error):
150s # Connect retry?
150s if connect is False:
150s raise reraise(type(error), error, _stacktrace)
150s elif connect is not None:
150s connect -= 1
150s
150s elif error and self._is_read_error(error):
150s # Read retry?
150s if read is False or method is None or not self._is_method_retryable(method):
150s raise reraise(type(error), error, _stacktrace)
150s elif read is not None:
150s read -= 1
150s
150s elif error:
150s # Other retry?
150s if other is not None:
150s other -= 1
150s
150s elif response and response.get_redirect_location():
150s # Redirect retry?
150s if redirect is not None:
150s redirect -= 1
150s cause = "too many redirects"
150s response_redirect_location = response.get_redirect_location()
150s if response_redirect_location:
150s redirect_location = response_redirect_location
150s status = response.status
150s
150s else:
150s # Incrementing because of a server error like a 500 in
150s # status_forcelist and the given method is in the allowed_methods
150s cause = ResponseError.GENERIC_ERROR
150s if response and response.status:
150s if status_count is not None:
150s status_count -= 1
150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s status = response.status
150s
150s history = self.history + (
150s RequestHistory(method, url, error, status, redirect_location),
150s )
150s
150s new_retry = self.new(
150s total=total,
150s connect=connect,
150s read=read,
150s redirect=redirect,
150s status=status_count,
150s other=other,
150s history=history,
150s )
150s
150s if new_retry.is_exhausted():
150s reason = error or ResponseError(cause)
150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?include_total=true (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s def test_response_x_records(self, api):
150s httpretty.enable()
150s httpretty.register_uri(
150s httpretty.GET,
150s "http://localhost:8080/pdb/query/v4/nodes",
150s adding_headers={"X-Records": 256},
150s body="[]",
150s )
150s > api._query("nodes", include_total=True)
150s
150s tests/test_api_base_query.py:233:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s except (ProtocolError, OSError) as err:
150s raise ConnectionError(err, request=request)
150s
150s except MaxRetryError as e:
150s if isinstance(e.reason, ConnectTimeoutError):
150s # TODO: Remove this in 3.0.0: see #2811
150s if not isinstance(e.reason, NewConnectionError):
150s raise ConnectTimeout(e, request=request)
150s
150s if isinstance(e.reason, ResponseError):
150s raise RetryError(e, request=request)
150s
150s if isinstance(e.reason, _ProxyError):
150s > raise ProxyError(e, request=request)
150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?include_total=true (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(240 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?include_total=true via GET at 1750344357.267481
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ________________ TestBaseAPIQuery.test_query_with_post[string] _________________
150s
150s self =
150s method = 'POST', url = 'http://localhost:8080/pdb/query/v4/nodes'
150s body = '{"query": "[\\"certname\\", \\"=\\", \\"node1\\"]", "count_by": 1}'
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'Content-Length': '60'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s def urlopen( # type: ignore[override]
150s self,
150s method: str,
150s url: str,
150s body: _TYPE_BODY | None = None,
150s headers: typing.Mapping[str, str] | None = None,
150s retries: Retry | bool | int | None = None,
150s redirect: bool = True,
150s assert_same_host: bool = True,
150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s pool_timeout: int | None = None,
150s release_conn: bool | None = None,
150s chunked: bool = False,
150s body_pos: _TYPE_BODY_POSITION | None = None,
150s preload_content: bool = True,
150s decode_content: bool = True,
150s **response_kw: typing.Any,
150s ) -> BaseHTTPResponse:
150s """
150s Get a connection from the pool and perform an HTTP request. This is the
150s lowest level call for making a request, so you'll need to specify all
150s the raw details.
150s
150s .. note::
150s
150s More commonly, it's appropriate to use a convenience method
150s such as :meth:`request`.
150s
150s .. note::
150s
150s `release_conn` will only behave as expected if
150s `preload_content=False` because we want to make
150s `preload_content=False` the default behaviour someday soon without
150s breaking backwards compatibility.
150s
150s :param method:
150s HTTP request method (such as GET, POST, PUT, etc.)
150s
150s :param url:
150s The URL to perform the request on.
150s
150s :param body:
150s Data to send in the request body, either :class:`str`, :class:`bytes`,
150s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s :param headers:
150s Dictionary of custom headers to send, such as User-Agent,
150s If-None-Match, etc. If None, pool headers are used. If provided,
150s these headers completely replace any pool-specific headers.
150s
150s :param retries:
150s Configure the number of retries to allow before raising a
150s :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s over different types of retries.
150s Pass an integer number to retry connection errors that many times,
150s but no other types of errors. Pass zero to never retry.
150s
150s If ``False``, then retries are disabled and any exception is raised
150s immediately. Also, instead of raising a MaxRetryError on redirects,
150s the redirect response will be returned.
150s
150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s :param redirect:
150s If True, automatically handle redirects (status codes 301, 302,
150s 303, 307, 308). Each redirect counts as a retry. Disabling retries
150s will disable redirect, too.
150s
150s :param assert_same_host:
150s If ``True``, will make sure that the host of the pool requests is
150s consistent else will raise HostChangedError. When ``False``, you can
150s use the pool on an HTTP proxy and request foreign hosts.
150s
150s :param timeout:
150s If specified, overrides the default timeout for this one
150s request. It may be a float (in seconds) or an instance of
150s :class:`urllib3.util.Timeout`.
150s
150s :param pool_timeout:
150s If set and the pool is set to block=True, then this method will
150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s connection is available within the time period.
150s
150s :param bool preload_content:
150s If True, the response's body will be preloaded into memory.
150s
150s :param bool decode_content:
150s If True, will attempt to decode the body based on the
150s 'content-encoding' header.
150s
150s :param release_conn:
150s If False, then the urlopen call will not release the connection
150s back into the pool once a response is received (but will release if
150s you read the entire contents of the response such as when
150s `preload_content=True`). This is useful if you're not preloading
150s the response's content immediately. You will need to call
150s ``r.release_conn()`` on the response ``r`` to return the connection
150s back into the pool. If None, it takes the value of ``preload_content``
150s which defaults to ``True``.
150s
150s :param bool chunked:
150s If True, urllib3 will send the body using chunked transfer
150s encoding. Otherwise, urllib3 will send the body using the standard
150s content-length form. Defaults to False.
150s
150s :param int body_pos:
150s Position to seek to in file-like body in the event of a retry or
150s redirect. Typically this won't need to be set because urllib3 will
150s auto-populate the value when needed.
150s """
150s parsed_url = parse_url(url)
150s destination_scheme = parsed_url.scheme
150s
150s if headers is None:
150s headers = self.headers
150s
150s if not isinstance(retries, Retry):
150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s if release_conn is None:
150s release_conn = preload_content
150s
150s # Check host
150s if assert_same_host and not self.is_same_host(url):
150s raise HostChangedError(self, url, retries)
150s
150s # Ensure that the URL we're connecting to is properly encoded
150s if url.startswith("/"):
150s url = to_str(_encode_target(url))
150s else:
150s url = to_str(parsed_url.url)
150s
150s conn = None
150s
150s # Track whether `conn` needs to be released before
150s # returning/raising/recursing. Update this variable if necessary, and
150s # leave `release_conn` constant throughout the function. That way, if
150s # the function recurses, the original value of `release_conn` will be
150s # passed down into the recursive call, and its value will be respected.
150s #
150s # See issue #651 [1] for details.
150s #
150s # [1]
150s release_this_conn = release_conn
150s
150s http_tunnel_required = connection_requires_http_tunnel(
150s self.proxy, self.proxy_config, destination_scheme
150s )
150s
150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s # have to copy the headers dict so we can safely change it without those
150s # changes being reflected in anyone else's copy.
150s if not http_tunnel_required:
150s headers = headers.copy() # type: ignore[attr-defined]
150s headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s # Must keep the exception bound to a separate variable or else Python 3
150s # complains about UnboundLocalError.
150s err = None
150s
150s # Keep track of whether we cleanly exited the except block. This
150s # ensures we do proper cleanup in finally.
150s clean_exit = False
150s
150s # Rewind body position, if needed. Record current position
150s # for future rewinds in the event of a redirect/retry.
150s body_pos = set_file_position(body, body_pos)
150s
150s try:
150s # Request a connection from the queue.
150s timeout_obj = self._get_timeout(timeout)
150s conn = self._get_conn(timeout=pool_timeout)
150s
150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s # Is this a closed/new connection that requires CONNECT tunnelling?
150s if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s try:
150s self._prepare_proxy(conn)
150s except (BaseSSLError, OSError, SocketTimeout) as e:
150s self._raise_timeout(
150s err=e, url=self.proxy.url, timeout_value=conn.timeout
150s )
150s raise
150s
150s # If we're going to release the connection in ``finally:``, then
150s # the response doesn't need to know about the connection. Otherwise
150s # it will also try to release it and we'll have a double-release
150s # mess.
150s response_conn = conn if not release_conn else None
150s
150s # Make the request on the HTTPConnection object
150s > response = self._make_request(
150s conn,
150s method,
150s url,
150s timeout=timeout_obj,
150s body=body,
150s headers=headers,
150s chunked=chunked,
150s retries=retries,
150s response_conn=response_conn,
150s preload_content=preload_content,
150s decode_content=decode_content,
150s **response_kw,
150s )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s def _read_status(self):
150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s if len(line) > _MAXLINE:
150s raise LineTooLong("status line")
150s if self.debuglevel > 0:
150s print("reply:", repr(line))
150s if not line:
150s # Presumably, the server closed the connection before
150s # sending a valid response.
150s > raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s E http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s response = self._make_request(
150s conn,
150s ...<10 lines>...
150s **response_kw,
150s )
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s response = conn.getresponse()
150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s httplib_response = super().getresponse()
150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s response.begin()
150s ~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s version, status, reason = self._read_status()
150s ~~~~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s > resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'POST', url = 'http://localhost:8080/pdb/query/v4/nodes'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s def increment(
150s self,
150s method: str | None = None,
150s url: str | None = None,
150s response: BaseHTTPResponse | None = None,
150s error: Exception | None = None,
150s _pool: ConnectionPool | None = None,
150s _stacktrace: TracebackType | None = None,
150s ) -> Self:
150s """Return a new Retry object with incremented retry counters.
150s
150s :param response: A response object, or None, if the server did not
150s return a response.
150s :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s :param Exception error: An error encountered during the request, or
150s None if the response was received successfully.
150s
150s :return: A new ``Retry`` object.
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s query = '["certname", "=", "node1"]' 150s 150s def test_query_with_post(self, api, query): 150s httpretty.reset() 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/query/v4/nodes", method=httpretty.POST) 150s > api._query("nodes", query=query, count_by=1, request_method="POST") 150s 150s tests/test_api_base_query.py:250: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:418: in _make_request 150s r = self.session.post( 150s /usr/lib/python3/dist-packages/requests/sessions.py:637: in post 150s return self.request("POST", url, data=data, json=json, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(242 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes via POST at 1750344357.381169 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 150s _____________ TestBaseAPIQuery.test_query_with_post[QueryBuilder] ______________ 150s 150s self = 150s method = 'POST', url = 'http://localhost:8080/pdb/query/v4/nodes' 150s body = '{"query": "[\\"=\\", \\"certname\\", \\"node1\\"]", "count_by": 1}' 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'Content-Length': '60'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s 
assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 
150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 
150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 
150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 
150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 
150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'POST', url = 'http://localhost:8080/pdb/query/v4/nodes' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 
150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or 
ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s query = Query: ["=", "certname", "node1"] 150s 150s def test_query_with_post(self, api, query): 150s httpretty.reset() 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/query/v4/nodes", method=httpretty.POST) 150s > api._query("nodes", query=query, count_by=1, request_method="POST") 150s 150s tests/test_api_base_query.py:250: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:418: in _make_request 150s r = self.session.post( 150s /usr/lib/python3/dist-packages/requests/sessions.py:637: in post 150s return self.request("POST", url, data=data, json=json, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def 
send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s except (ProtocolError, OSError) as err:
150s raise ConnectionError(err, request=request)
150s
150s except MaxRetryError as e:
150s if isinstance(e.reason, ConnectTimeoutError):
150s # TODO: Remove this in 3.0.0: see #2811
150s if not isinstance(e.reason, NewConnectionError):
150s raise ConnectTimeout(e, request=request)
150s
150s if isinstance(e.reason, ResponseError):
150s raise RetryError(e, request=request)
150s
150s if isinstance(e.reason, _ProxyError):
150s > raise ProxyError(e, request=request)
150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(242 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes via POST at 1750344357.4948344
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ________________________ TestMetricsAPI.test_metric_v1 _________________________
150s
150s self =
150s method = 'GET', url = 'http://localhost:8080/metrics/v1/mbeans/test'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v1/mbeans/test', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s def urlopen( # type: ignore[override]
150s self,
150s method: str,
150s url: str,
150s body: _TYPE_BODY | None = None,
150s headers: typing.Mapping[str, str] | None = None,
150s retries: Retry | bool | int | None = None,
150s redirect: bool = True,
150s assert_same_host: bool = True,
150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s pool_timeout: int | None = None,
150s release_conn: bool | None = None,
150s chunked: bool = False,
150s body_pos: _TYPE_BODY_POSITION | None = None,
150s preload_content: bool = True,
150s decode_content: bool = True,
150s **response_kw: typing.Any,
150s ) -> BaseHTTPResponse:
150s """
150s Get a connection from the pool and perform an HTTP request. This is the
150s lowest level call for making a request, so you'll need to specify all
150s the raw details.
150s
150s .. note::
150s
150s More commonly, it's appropriate to use a convenience method
150s such as :meth:`request`.
150s
150s .. note::
150s
150s `release_conn` will only behave as expected if
150s `preload_content=False` because we want to make
150s `preload_content=False` the default behaviour someday soon without
150s breaking backwards compatibility.
150s
150s :param method:
150s HTTP request method (such as GET, POST, PUT, etc.)
150s
150s :param url:
150s The URL to perform the request on.
150s
150s :param body:
150s Data to send in the request body, either :class:`str`, :class:`bytes`,
150s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s :param headers:
150s Dictionary of custom headers to send, such as User-Agent,
150s If-None-Match, etc. If None, pool headers are used. If provided,
150s these headers completely replace any pool-specific headers.
150s
150s :param retries:
150s Configure the number of retries to allow before raising a
150s :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s over different types of retries.
150s Pass an integer number to retry connection errors that many times,
150s but no other types of errors. Pass zero to never retry.
150s
150s If ``False``, then retries are disabled and any exception is raised
150s immediately. Also, instead of raising a MaxRetryError on redirects,
150s the redirect response will be returned.
150s
150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s :param redirect:
150s If True, automatically handle redirects (status codes 301, 302,
150s 303, 307, 308). Each redirect counts as a retry. Disabling retries
150s will disable redirect, too.
150s
150s :param assert_same_host:
150s If ``True``, will make sure that the host of the pool requests is
150s consistent else will raise HostChangedError. When ``False``, you can
150s use the pool on an HTTP proxy and request foreign hosts.
150s
150s :param timeout:
150s If specified, overrides the default timeout for this one
150s request. It may be a float (in seconds) or an instance of
150s :class:`urllib3.util.Timeout`.
150s
150s :param pool_timeout:
150s If set and the pool is set to block=True, then this method will
150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s connection is available within the time period.
150s
150s :param bool preload_content:
150s If True, the response's body will be preloaded into memory.
150s
150s :param bool decode_content:
150s If True, will attempt to decode the body based on the
150s 'content-encoding' header.
150s
150s :param release_conn:
150s If False, then the urlopen call will not release the connection
150s back into the pool once a response is received (but will release if
150s you read the entire contents of the response such as when
150s `preload_content=True`). This is useful if you're not preloading
150s the response's content immediately. You will need to call
150s ``r.release_conn()`` on the response ``r`` to return the connection
150s back into the pool. If None, it takes the value of ``preload_content``
150s which defaults to ``True``.
150s
150s :param bool chunked:
150s If True, urllib3 will send the body using chunked transfer
150s encoding. Otherwise, urllib3 will send the body using the standard
150s content-length form. Defaults to False.
150s
150s :param int body_pos:
150s Position to seek to in file-like body in the event of a retry or
150s redirect. Typically this won't need to be set because urllib3 will
150s auto-populate the value when needed.
150s """
150s parsed_url = parse_url(url)
150s destination_scheme = parsed_url.scheme
150s
150s if headers is None:
150s headers = self.headers
150s
150s if not isinstance(retries, Retry):
150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s if release_conn is None:
150s release_conn = preload_content
150s
150s # Check host
150s if assert_same_host and not self.is_same_host(url):
150s raise HostChangedError(self, url, retries)
150s
150s # Ensure that the URL we're connecting to is properly encoded
150s if url.startswith("/"):
150s url = to_str(_encode_target(url))
150s else:
150s url = to_str(parsed_url.url)
150s
150s conn = None
150s
150s # Track whether `conn` needs to be released before
150s # returning/raising/recursing. Update this variable if necessary, and
150s # leave `release_conn` constant throughout the function. That way, if
150s # the function recurses, the original value of `release_conn` will be
150s # passed down into the recursive call, and its value will be respected.
150s #
150s # See issue #651 [1] for details.
150s #
150s # [1]
150s release_this_conn = release_conn
150s
150s http_tunnel_required = connection_requires_http_tunnel(
150s self.proxy, self.proxy_config, destination_scheme
150s )
150s
150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s # have to copy the headers dict so we can safely change it without those
150s # changes being reflected in anyone else's copy.
150s if not http_tunnel_required:
150s headers = headers.copy() # type: ignore[attr-defined]
150s headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s # Must keep the exception bound to a separate variable or else Python 3
150s # complains about UnboundLocalError.
150s err = None
150s
150s # Keep track of whether we cleanly exited the except block. This
150s # ensures we do proper cleanup in finally.
150s clean_exit = False
150s
150s # Rewind body position, if needed. Record current position
150s # for future rewinds in the event of a redirect/retry.
150s body_pos = set_file_position(body, body_pos)
150s
150s try:
150s # Request a connection from the queue.
150s timeout_obj = self._get_timeout(timeout)
150s conn = self._get_conn(timeout=pool_timeout)
150s
150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s # Is this a closed/new connection that requires CONNECT tunnelling?
150s if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s try:
150s self._prepare_proxy(conn)
150s except (BaseSSLError, OSError, SocketTimeout) as e:
150s self._raise_timeout(
150s err=e, url=self.proxy.url, timeout_value=conn.timeout
150s )
150s raise
150s
150s # If we're going to release the connection in ``finally:``, then
150s # the response doesn't need to know about the connection. Otherwise
150s # it will also try to release it and we'll have a double-release
150s # mess.
150s response_conn = conn if not release_conn else None
150s
150s # Make the request on the HTTPConnection object
150s > response = self._make_request(
150s conn,
150s method,
150s url,
150s timeout=timeout_obj,
150s body=body,
150s headers=headers,
150s chunked=chunked,
150s retries=retries,
150s response_conn=response_conn,
150s preload_content=preload_content,
150s decode_content=decode_content,
150s **response_kw,
150s )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s def _read_status(self):
150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s if len(line) > _MAXLINE:
150s raise LineTooLong("status line")
150s if self.debuglevel > 0:
150s print("reply:", repr(line))
150s if not line:
150s # Presumably, the server closed the connection before
150s # sending a valid response.
150s > raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s E http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s response = self._make_request(
150s conn,
150s ...<10 lines>...
150s **response_kw,
150s )
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s response = conn.getresponse()
150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s httplib_response = super().getresponse()
150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s response.begin()
150s ~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s version, status, reason = self._read_status()
150s ~~~~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s > resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/metrics/v1/mbeans/test'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s def increment(
150s self,
150s method: str | None = None,
150s url: str | None = None,
150s response: BaseHTTPResponse | None = None,
150s error: Exception | None = None,
150s _pool: ConnectionPool | None = None,
150s _stacktrace: TracebackType | None = None,
150s ) -> Self:
150s """Return a new Retry object with incremented retry counters.
150s
150s :param response: A response object, or None, if the server did not
150s return a response.
150s :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s :param Exception error: An error encountered during the request, or
150s None if the response was received successfully.
150s
150s :return: A new ``Retry`` object.
150s """
150s if self.total is False and error:
150s # Disabled, indicate to re-raise the error.
150s raise reraise(type(error), error, _stacktrace)
150s
150s total = self.total
150s if total is not None:
150s total -= 1
150s
150s connect = self.connect
150s read = self.read
150s redirect = self.redirect
150s status_count = self.status
150s other = self.other
150s cause = "unknown"
150s status = None
150s redirect_location = None
150s
150s if error and self._is_connection_error(error):
150s # Connect retry?
150s if connect is False:
150s raise reraise(type(error), error, _stacktrace)
150s elif connect is not None:
150s connect -= 1
150s
150s elif error and self._is_read_error(error):
150s # Read retry?
150s if read is False or method is None or not self._is_method_retryable(method):
150s raise reraise(type(error), error, _stacktrace)
150s elif read is not None:
150s read -= 1
150s
150s elif error:
150s # Other retry?
150s if other is not None:
150s other -= 1
150s
150s elif response and response.get_redirect_location():
150s # Redirect retry?
150s if redirect is not None:
150s redirect -= 1
150s cause = "too many redirects"
150s response_redirect_location = response.get_redirect_location()
150s if response_redirect_location:
150s redirect_location = response_redirect_location
150s status = response.status
150s
150s else:
150s # Incrementing because of a server error like a 500 in
150s # status_forcelist and the given method is in the allowed_methods
150s cause = ResponseError.GENERIC_ERROR
150s if response and response.status:
150s if status_count is not None:
150s status_count -= 1
150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s status = response.status
150s
150s history = self.history + (
150s RequestHistory(method, url, error, status, redirect_location),
150s )
150s
150s new_retry = self.new(
150s total=total,
150s connect=connect,
150s read=read,
150s redirect=redirect,
150s status=status_count,
150s other=other,
150s history=history,
150s )
150s
150s if new_retry.is_exhausted():
150s reason = error or ResponseError(cause)
150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v1/mbeans/test (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s def test_metric_v1(self, api):
150s httpretty.enable()
150s httpretty.enable()
150s stub_request("http://localhost:8080/metrics/v1/mbeans/test")
150s > api.metric("test", version="v1")
150s
150s tests/test_api_metrics.py:23:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:37: in metric
150s return self._query("mbean", path=metric)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s except (ProtocolError, OSError) as err:
150s raise ConnectionError(err, request=request)
150s
150s except MaxRetryError as e:
150s if isinstance(e.reason, ConnectTimeoutError):
150s # TODO: Remove this in 3.0.0: see #2811
150s if not isinstance(e.reason, NewConnectionError):
150s raise ConnectTimeout(e, request=request)
150s
150s if isinstance(e.reason, ResponseError):
150s raise RetryError(e, request=request)
150s
150s if isinstance(e.reason, _ProxyError):
150s > raise ProxyError(e, request=request)
150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v1/mbeans/test (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(225 bytes) to ://localhost:8080http://localhost:8080/metrics/v1/mbeans/test via GET at 1750344357.626448
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ______________________ TestMetricsAPI.test_metric_v1_list ______________________
150s
150s self =
150s method = 'GET', url = 'http://localhost:8080/metrics/v1/mbeans', body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v1/mbeans', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s def urlopen( # type: ignore[override]
150s self,
150s method: str,
150s url: str,
150s body: _TYPE_BODY | None = None,
150s headers: typing.Mapping[str, str] | None = None,
150s retries: Retry | bool | int | None = None,
150s redirect: bool = True,
150s assert_same_host: bool = True,
150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s pool_timeout: int | None = None,
150s release_conn: bool | None = None,
150s chunked: bool = False,
150s body_pos: _TYPE_BODY_POSITION | None = None,
150s preload_content: bool = True,
150s decode_content: bool = True,
150s **response_kw: typing.Any,
150s ) -> BaseHTTPResponse:
150s """
150s Get a connection from the pool and perform an HTTP request. This is the
150s lowest level call for making a request, so you'll need to specify all
150s the raw details.
150s
150s .. note::
150s
150s More commonly, it's appropriate to use a convenience method
150s such as :meth:`request`.
150s
150s .. note::
150s
150s `release_conn` will only behave as expected if
150s `preload_content=False` because we want to make
150s `preload_content=False` the default behaviour someday soon without
150s breaking backwards compatibility.
150s
150s :param method:
150s HTTP request method (such as GET, POST, PUT, etc.)
150s
150s :param url:
150s The URL to perform the request on.
150s
150s :param body:
150s Data to send in the request body, either :class:`str`, :class:`bytes`,
150s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s :param headers:
150s Dictionary of custom headers to send, such as User-Agent,
150s If-None-Match, etc. If None, pool headers are used. If provided,
150s these headers completely replace any pool-specific headers.
150s
150s :param retries:
150s Configure the number of retries to allow before raising a
150s :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s over different types of retries.
150s Pass an integer number to retry connection errors that many times,
150s but no other types of errors. Pass zero to never retry.
150s
150s If ``False``, then retries are disabled and any exception is raised
150s immediately. Also, instead of raising a MaxRetryError on redirects,
150s the redirect response will be returned.
150s
150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s :param redirect:
150s If True, automatically handle redirects (status codes 301, 302,
150s 303, 307, 308). Each redirect counts as a retry. Disabling retries
150s will disable redirect, too.
150s
150s :param assert_same_host:
150s If ``True``, will make sure that the host of the pool requests is
150s consistent else will raise HostChangedError. When ``False``, you can
150s use the pool on an HTTP proxy and request foreign hosts.
150s
150s :param timeout:
150s If specified, overrides the default timeout for this one
150s request. It may be a float (in seconds) or an instance of
150s :class:`urllib3.util.Timeout`.
150s
150s :param pool_timeout:
150s If set and the pool is set to block=True, then this method will
150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s connection is available within the time period.
150s
150s :param bool preload_content:
150s If True, the response's body will be preloaded into memory.
150s
150s :param bool decode_content:
150s If True, will attempt to decode the body based on the
150s 'content-encoding' header.
150s
150s :param release_conn:
150s If False, then the urlopen call will not release the connection
150s back into the pool once a response is received (but will release if
150s you read the entire contents of the response such as when
150s `preload_content=True`). This is useful if you're not preloading
150s the response's content immediately. You will need to call
150s ``r.release_conn()`` on the response ``r`` to return the connection
150s back into the pool. If None, it takes the value of ``preload_content``
150s which defaults to ``True``.
150s
150s :param bool chunked:
150s If True, urllib3 will send the body using chunked transfer
150s encoding. Otherwise, urllib3 will send the body using the standard
150s content-length form. Defaults to False.
150s
150s :param int body_pos:
150s Position to seek to in file-like body in the event of a retry or
150s redirect. Typically this won't need to be set because urllib3 will
150s auto-populate the value when needed.
150s """
150s parsed_url = parse_url(url)
150s destination_scheme = parsed_url.scheme
150s
150s if headers is None:
150s headers = self.headers
150s
150s if not isinstance(retries, Retry):
150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s if release_conn is None:
150s release_conn = preload_content
150s
150s # Check host
150s if assert_same_host and not self.is_same_host(url):
150s raise HostChangedError(self, url, retries)
150s
150s # Ensure that the URL we're connecting to is properly encoded
150s if url.startswith("/"):
150s url = to_str(_encode_target(url))
150s else:
150s url = to_str(parsed_url.url)
150s
150s conn = None
150s
150s # Track whether `conn` needs to be released before
150s # returning/raising/recursing. Update this variable if necessary, and
150s # leave `release_conn` constant throughout the function. That way, if
150s # the function recurses, the original value of `release_conn` will be
150s # passed down into the recursive call, and its value will be respected.
150s #
150s # See issue #651 [1] for details.
150s #
150s # [1]
150s release_this_conn = release_conn
150s
150s http_tunnel_required = connection_requires_http_tunnel(
150s self.proxy, self.proxy_config, destination_scheme
150s )
150s
150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s # have to copy the headers dict so we can safely change it without those
150s # changes being reflected in anyone else's copy.
150s if not http_tunnel_required:
150s headers = headers.copy() # type: ignore[attr-defined]
150s headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s # Must keep the exception bound to a separate variable or else Python 3
150s # complains about UnboundLocalError.
150s err = None
150s
150s # Keep track of whether we cleanly exited the except block. This
150s # ensures we do proper cleanup in finally.
150s clean_exit = False
150s
150s # Rewind body position, if needed. Record current position
150s # for future rewinds in the event of a redirect/retry.
150s body_pos = set_file_position(body, body_pos)
150s
150s try:
150s # Request a connection from the queue.
150s timeout_obj = self._get_timeout(timeout)
150s conn = self._get_conn(timeout=pool_timeout)
150s
150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s # Is this a closed/new connection that requires CONNECT tunnelling?
150s if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s try:
150s self._prepare_proxy(conn)
150s except (BaseSSLError, OSError, SocketTimeout) as e:
150s self._raise_timeout(
150s err=e, url=self.proxy.url, timeout_value=conn.timeout
150s )
150s raise
150s
150s # If we're going to release the connection in ``finally:``, then
150s # the response doesn't need to know about the connection. Otherwise
150s # it will also try to release it and we'll have a double-release
150s # mess.
150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 
150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/metrics/v1/mbeans', response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 
150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or 
ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v1/mbeans (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_metric_v1_list(self, api): 150s httpretty.enable() 150s httpretty.enable() 150s stub_request("http://localhost:8080/metrics/v1/mbeans") 150s > api.metric(version="v1") 150s 150s tests/test_api_metrics.py:30: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:37: in metric 150s return self._query("mbean", path=metric) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, 
stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v1/mbeans (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(220 bytes) to ://localhost:8080http://localhost:8080/metrics/v1/mbeans via GET at 1750344357.7378209 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
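The failure above is environmental rather than a bug in pypuppetdb: the testbed exports `http_proxy`/`https_proxy` pointing at `egress.ps7.internal:3128` (visible in the `proxies` OrderedDict in the traceback), so requests routes even the stubbed `http://localhost:8080` URLs through the real proxy, which is exactly the `real call to socket.connect() for ('egress.ps7.internal', 3128)` httpretty warns about before the `RemoteDisconnected`/`ProxyError`. A minimal standard-library sketch of that proxy-resolution behaviour follows; the proxy host is copied from the log, and the `no_proxy` exemption shown is one plausible remedy, not necessarily the fix the package maintainers applied.

```python
import os
import urllib.request

# Simulate the testbed environment: an egress proxy configured via
# environment variables (host name copied from the log above).
os.environ["http_proxy"] = "http://egress.ps7.internal:3128/"
os.environ.pop("no_proxy", None)
os.environ.pop("NO_PROXY", None)

# Proxy resolution picks up the egress proxy for plain HTTP...
assert urllib.request.getproxies_environment()["http"] == "http://egress.ps7.internal:3128/"

# ...and without an exemption "localhost" is NOT bypassed, so the request
# for http://localhost:8080/metrics/v1/mbeans really goes to the proxy,
# never reaching httpretty's socket-level interception.
assert not urllib.request.proxy_bypass_environment("localhost")

# Exempting localhost keeps the connection local, where httpretty can
# intercept it.
os.environ["no_proxy"] = "localhost,127.0.0.1"
assert urllib.request.proxy_bypass_environment("localhost")
```

requests consults the same environment variables (via `Session.trust_env`, which defaults to `True`), so setting `no_proxy` in the test environment, or `session.trust_env = False` in the test fixtures, would keep these stubbed localhost requests off the proxy.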
150s ______________ TestMetricsAPI.test_metric_v1_version_constructor _______________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/metrics/v1/mbeans/test' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v1/mbeans/test', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/metrics/v1/mbeans/test' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E   urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v1/mbeans/test (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s 
150s     def test_metric_v1_version_constructor(self):
150s         api = pypuppetdb.api.API(metric_api_version="v1")
150s         httpretty.enable()
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/metrics/v1/mbeans/test")
150s >       api.metric("test")
150s 
150s tests/test_api_metrics.py:38: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:37: in metric
150s     return self._query("mbean", path=metric)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s 
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s 
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s 
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E   requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v1/mbeans/test (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(225 bytes) to ://localhost:8080http://localhost:8080/metrics/v1/mbeans/test via GET at 1750344357.8492033
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ________________________ TestMetricsAPI.test_metric_v2 _________________________
150s 
150s self = 
150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v2/read/test%3Aname%3DNum', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 
150s         .. note::
150s 
150s            More commonly, it's appropriate to use a convenience method
150s            such as :meth:`request`.
150s 
150s         .. note::
150s 
150s            `release_conn` will only behave as expected if
150s            `preload_content=False` because we want to make
150s            `preload_content=False` the default behaviour someday soon without
150s            breaking backwards compatibility.
150s 
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s 
150s         :param url:
150s             The URL to perform the request on.
150s 
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s 
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s 
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s 
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s 
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s 
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s 
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s 
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s 
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s 
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s 
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s 
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s 
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s 
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s 
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s 
150s         if headers is None:
150s             headers = self.headers
150s 
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s 
150s         if release_conn is None:
150s             release_conn = preload_content
150s 
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s 
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s 
150s         conn = None
150s 
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1] 
150s         release_this_conn = release_conn
150s 
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s 
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s 
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s 
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s 
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s 
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s 
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s 
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s 
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s 
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s 
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E   http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s 
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool = 
150s _stacktrace = 
150s 
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s 
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s 
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s 
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s 
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s 
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s 
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s 
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s 
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E   urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_metric_v2(self, api):
150s         metrics_body = {
150s             "request": {"mbean": "test:name=Num", "type": "read"},
150s             "value": {"Value": 0},
150s             "timestamp": 0,
150s             "status": 200,
150s         }
150s 
150s         httpretty.enable()
150s         httpretty.register_uri(
150s             httpretty.GET,
150s             "http://localhost:8080/metrics/v2/read/test:name=Num",
150s             body=json.dumps(metrics_body),
150s         )
150s >       metric = api.metric("test:name=Num")
150s 
150s tests/test_api_metrics.py:55: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:31: in metric
150s     res = self._query("metrics", path=self._escape_metric_name(metric))
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s 
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s 
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s 
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E   requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(236 bytes) to ://localhost:8080http://localhost:8080/metrics/v2/read/test%3Aname%3DNum via GET at 1750344357.9636693
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
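As an editorial annotation on the failures above: requests' default `HTTPAdapter` retry policy is `Retry(total=0, ..., read=False)`, so the very first proxy error exhausts the policy and `Retry.increment()` raises `MaxRetryError` immediately rather than retrying. A minimal sketch reproducing just that step in isolation (the URL and error message simply mirror the log; this snippet is not part of the pypuppetdb test suite):

```python
from urllib3.util.retry import Retry
from urllib3.exceptions import MaxRetryError


def exhausts_on_first_error(retry: Retry) -> bool:
    """Return True if a single errored attempt exhausts the retry policy."""
    try:
        # increment() normally returns a new Retry with decremented counters;
        # with total=0 the counter drops below zero, is_exhausted() becomes
        # true, and MaxRetryError is raised from the original error, exactly
        # as at urllib3/util/retry.py:519 in the traceback above.
        retry.increment(
            method="GET",
            url="http://localhost:8080/metrics/v1/mbeans/test",
            error=OSError("Remote end closed connection without response"),
        )
        return False
    except MaxRetryError:
        return True


# The policy shown in the traceback above (requests' adapter default).
policy = Retry(total=0, connect=None, read=False, redirect=None, status=None)
print(exhausts_on_first_error(policy))
```

This also explains why the error surfaces as `ProxyError` rather than a timeout: the environment's `http_proxy`/`https_proxy` settings (`egress.ps7.internal:3128`) make requests open a real socket to the proxy, bypassing httpretty's registered stub for `localhost:8080`, and the proxy closes the connection without a response.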
150s ______________ TestMetricsAPI.test_metric_v2_version_constructor _______________
150s 
150s self = 
150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v2/read/test%3Aname%3DNum', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 
150s         .. note::
150s 
150s            More commonly, it's appropriate to use a convenience method
150s            such as :meth:`request`.
150s 
150s         .. note::
150s 
150s            `release_conn` will only behave as expected if
150s            `preload_content=False` because we want to make
150s            `preload_content=False` the default behaviour someday soon without
150s            breaking backwards compatibility.
150s 
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s 
150s         :param url:
150s             The URL to perform the request on.
150s 
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s 
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s 
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s 
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s 
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s 
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s 
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s 
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s 
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s 
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s 
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s 
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s 
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s 
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s 
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s 150s def test_metric_v2_version_constructor(self): 150s api = pypuppetdb.api.API(metric_api_version="v2") 150s metrics_body = { 150s "request": {"mbean": "test:name=Num", "type": "read"}, 150s "value": {"Value": 0}, 150s "timestamp": 0, 150s "status": 200, 150s } 150s 150s httpretty.enable() 150s httpretty.register_uri( 150s httpretty.GET, 150s 
"http://localhost:8080/metrics/v2/read/test:name=Num", 150s body=json.dumps(metrics_body), 150s ) 150s > metric = api.metric("test:name=Num") 150s 150s tests/test_api_metrics.py:76: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:31: in metric 150s res = self._query("metrics", path=self._escape_metric_name(metric)) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(236 bytes) to ://localhost:8080http://localhost:8080/metrics/v2/read/test%3Aname%3DNum via GET at 1750344358.0749812 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
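The failure report above is consistent with one root cause, inferred from the captured locals rather than stated in the log: the testbed exports `http_proxy`/`https_proxy` pointing at `egress.ps7.internal:3128`, and requests merges those environment proxies into every request, so the call to the httpretty-mocked `localhost:8080` URL is routed to the real proxy instead of being intercepted. A minimal sketch of that proxy-merging behaviour, using requests' documented `Session.merge_environment_settings` API (the URL is the one from the test; the fix shown, `trust_env = False`, is one assumed remedy, `no_proxy=localhost` being another):

```python
# Sketch: requests resolves proxies from the environment per request when
# trust_env is True (the default). Disabling trust_env makes the session
# ignore http_proxy / https_proxy / no_proxy, so a mocked localhost URL
# is contacted directly and httpretty can intercept the socket.
import requests

session = requests.Session()
session.trust_env = False  # ignore proxy-related environment variables

settings = session.merge_environment_settings(
    url="http://localhost:8080/metrics/v2/read/test:name=Num",
    proxies={},   # no proxies passed explicitly by the caller
    stream=False,
    verify=True,
    cert=None,
)
print(settings["proxies"])  # {} -> request will not be routed via a proxy
```

With the default `trust_env = True`, the same call on this testbed would return the `egress.ps7.internal:3128` mapping seen in the `proxies = OrderedDict(...)` locals above.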
150s _________________ TestMetricsAPI.test_metric_v2_version_string _________________
150s
150s self =
150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v2/read/test%3Aname%3DNum', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s     def urlopen(  # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s
150s         .. note::
150s
150s             More commonly, it's appropriate to use a convenience method
150s             such as :meth:`request`.
150s
150s         .. note::
150s
150s             `release_conn` will only behave as expected if
150s             `preload_content=False` because we want to make
150s             `preload_content=False` the default behaviour someday soon without
150s             breaking backwards compatibility.
150s
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s
150s         :param url:
150s             The URL to perform the request on.
150s
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s         """
150s         parsed_url = parse_url(url)
150s         destination_scheme = parsed_url.scheme
150s
150s         if headers is None:
150s             headers = self.headers
150s
150s         if not isinstance(retries, Retry):
150s             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s         if release_conn is None:
150s             release_conn = preload_content
150s
150s         # Check host
150s         if assert_same_host and not self.is_same_host(url):
150s             raise HostChangedError(self, url, retries)
150s
150s         # Ensure that the URL we're connecting to is properly encoded
150s         if url.startswith("/"):
150s             url = to_str(_encode_target(url))
150s         else:
150s             url = to_str(parsed_url.url)
150s
150s         conn = None
150s
150s         # Track whether `conn` needs to be released before
150s         # returning/raising/recursing. Update this variable if necessary, and
150s         # leave `release_conn` constant throughout the function. That way, if
150s         # the function recurses, the original value of `release_conn` will be
150s         # passed down into the recursive call, and its value will be respected.
150s         #
150s         # See issue #651 [1] for details.
150s         #
150s         # [1]
150s         release_this_conn = release_conn
150s
150s         http_tunnel_required = connection_requires_http_tunnel(
150s             self.proxy, self.proxy_config, destination_scheme
150s         )
150s
150s         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s         # have to copy the headers dict so we can safely change it without those
150s         # changes being reflected in anyone else's copy.
150s         if not http_tunnel_required:
150s             headers = headers.copy()  # type: ignore[attr-defined]
150s             headers.update(self.proxy_headers)  # type: ignore[union-attr]
150s
150s         # Must keep the exception bound to a separate variable or else Python 3
150s         # complains about UnboundLocalError.
150s         err = None
150s
150s         # Keep track of whether we cleanly exited the except block. This
150s         # ensures we do proper cleanup in finally.
150s         clean_exit = False
150s
150s         # Rewind body position, if needed. Record current position
150s         # for future rewinds in the event of a redirect/retry.
150s         body_pos = set_file_position(body, body_pos)
150s
150s         try:
150s             # Request a connection from the queue.
150s             timeout_obj = self._get_timeout(timeout)
150s             conn = self._get_conn(timeout=pool_timeout)
150s
150s             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
150s
150s             # Is this a closed/new connection that requires CONNECT tunnelling?
150s             if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s                 try:
150s                     self._prepare_proxy(conn)
150s                 except (BaseSSLError, OSError, SocketTimeout) as e:
150s                     self._raise_timeout(
150s                         err=e, url=self.proxy.url, timeout_value=conn.timeout
150s                     )
150s                     raise
150s
150s             # If we're going to release the connection in ``finally:``, then
150s             # the response doesn't need to know about the connection. Otherwise
150s             # it will also try to release it and we'll have a double-release
150s             # mess.
150s             response_conn = conn if not release_conn else None
150s
150s             # Make the request on the HTTPConnection object
150s >           response = self._make_request(
150s                 conn,
150s                 method,
150s                 url,
150s                 timeout=timeout_obj,
150s                 body=body,
150s                 headers=headers,
150s                 chunked=chunked,
150s                 retries=retries,
150s                 response_conn=response_conn,
150s                 preload_content=preload_content,
150s                 decode_content=decode_content,
150s                 **response_kw,
150s             )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s     response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s     httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s     response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s     version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s     def test_metric_v2_version_string(self, api):
150s         metrics_body = {
150s             "request": {"mbean": "test:name=Num", "type": "read"},
150s             "value": {"Value": 0},
150s             "timestamp": 0,
150s             "status": 200,
150s         }
150s
150s         httpretty.enable()
150s         httpretty.register_uri(
150s             httpretty.GET,
150s             "http://localhost:8080/metrics/v2/read/test:name=Num",
150s 
body=json.dumps(metrics_body), 150s ) 150s > metric = api.metric("test:name=Num", version="v2") 150s 150s tests/test_api_metrics.py:96: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:31: in metric 150s res = self._query("metrics", path=self._escape_metric_name(metric)) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(236 bytes) to ://localhost:8080http://localhost:8080/metrics/v2/read/test%3Aname%3DNum via GET at 1750344358.2039514 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
150s _____________________ TestMetricsAPI.test_metric_v2_error ______________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v2/read/test%3Aname%3DNum', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/metrics/v2/read/test%3Aname%3DNum' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_metric_v2_error(self, api): 150s metrics_body = { 150s "request": {"mbean": "test:name=Num", "type": "read"}, 150s "error_type": "javax.management.InstanceNotFoundException", 150s "error": "javax.management.InstanceNotFoundException : test:name=Num", 150s "status": 404, 150s } 150s 150s httpretty.enable() 150s httpretty.register_uri( 150s httpretty.GET, 150s 
"http://localhost:8080/metrics/v2/read/test:name=Num", 150s body=json.dumps(metrics_body), 150s ) 150s with pytest.raises(pypuppetdb.errors.APIError): 150s > api.metric("test:name=Num") 150s 150s tests/test_api_metrics.py:117: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:31: in metric 150s res = self._query("metrics", path=self._escape_metric_name(metric)) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aname%3DNum (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(236 bytes) to ://localhost:8080http://localhost:8080/metrics/v2/read/test%3Aname%3DNum via GET at 1750344358.3197124 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
150s ___________ TestMetricsAPI.test_metric_v2_escape_special_characters ____________ 150s 150s self = 150s method = 'GET' 150s url = 'http://localhost:8080/metrics/v2/read/test%3Aspecial%21/chars%21%21metric%21%22name' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v2/read/test%3Aspecial%21/chars%21%21metric%21%22name', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET' 150s url = 'http://localhost:8080/metrics/v2/read/test%3Aspecial%21/chars%21%21metric%21%22name' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aspecial%21/chars%21%21metric%21%22name (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_metric_v2_escape_special_characters(self, api): 150s metrics_body = { 150s "request": {"mbean": "test:name=Num", "type": "read"}, 150s "value": {"Value": 0}, 150s "timestamp": 0, 150s "status": 200, 150s } 150s 150s httpretty.enable() 150s metric_name = 'test:special/chars!metric"name' 150s metric_escaped = 
'test:special!/chars!!metric!"name' 150s metric_escaped_urlencoded = "test%3Aspecial%21/chars%21%21metric%21%22name" 150s httpretty.register_uri( 150s httpretty.GET, 150s ("http://localhost:8080/metrics/v2/read/" + metric_escaped), 150s body=json.dumps(metrics_body), 150s ) 150s > metric = api.metric(metric_name) 150s 150s tests/test_api_metrics.py:139: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:31: in metric 150s res = self._query("metrics", path=self._escape_metric_name(metric)) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 
150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/read/test%3Aspecial%21/chars%21%21metric%21%22name (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(264 bytes) to ://localhost:8080http://localhost:8080/metrics/v2/read/test%3Aspecial%21/chars%21%21metric%21%22name via GET at 1750344358.4338295 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
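For reference, the escaping exercised by test_metric_v2_escape_special_characters follows the Jolokia path convention visible in the captured values above: `'test:special/chars!metric"name'` becomes `'test:special!/chars!!metric!"name'`, with `!` as the escape character (doubled when literal) before URL-encoding. A minimal re-implementation sketch consistent with those values, not the pypuppetdb code under test:

```python
from urllib.parse import quote

def escape_jolokia_path(name: str) -> str:
    """Escape a Jolokia v2 path segment: literal '!' becomes '!!',
    '/' becomes '!/', and '"' becomes '!"'. The '!' replacement must
    run first so escape characters introduced later are not doubled."""
    return name.replace("!", "!!").replace("/", "!/").replace('"', '!"')

def escaped_url_path(name: str) -> str:
    """URL-encode the escaped segment; quote() keeps '/' but encodes
    ':' and '!', matching the metric_escaped_urlencoded value above."""
    return quote(escape_jolokia_path(name))
```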
150s ______________________ TestMetricsAPI.test_metric_v2_list ______________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/metrics/v2/list', body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/metrics/v2/list', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position
150s # for future rewinds in the event of a redirect/retry.
150s body_pos = set_file_position(body, body_pos)
150s
150s try:
150s # Request a connection from the queue.
150s timeout_obj = self._get_timeout(timeout)
150s conn = self._get_conn(timeout=pool_timeout)
150s
150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s # Is this a closed/new connection that requires CONNECT tunnelling?
150s if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s try:
150s self._prepare_proxy(conn)
150s except (BaseSSLError, OSError, SocketTimeout) as e:
150s self._raise_timeout(
150s err=e, url=self.proxy.url, timeout_value=conn.timeout
150s )
150s raise
150s
150s # If we're going to release the connection in ``finally:``, then
150s # the response doesn't need to know about the connection. Otherwise
150s # it will also try to release it and we'll have a double-release
150s # mess.
150s response_conn = conn if not release_conn else None
150s
150s # Make the request on the HTTPConnection object
150s > response = self._make_request(
150s conn,
150s method,
150s url,
150s timeout=timeout_obj,
150s body=body,
150s headers=headers,
150s chunked=chunked,
150s retries=retries,
150s response_conn=response_conn,
150s preload_content=preload_content,
150s decode_content=decode_content,
150s **response_kw,
150s )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s def _read_status(self):
150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s if len(line) > _MAXLINE:
150s raise LineTooLong("status line")
150s if self.debuglevel > 0:
150s print("reply:", repr(line))
150s if not line:
150s # Presumably, the server closed the connection before
150s # sending a valid response.
150s > raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s E http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s response = self._make_request(
150s conn,
150s ...<10 lines>...
150s **response_kw,
150s )
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s response = conn.getresponse()
150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s httplib_response = super().getresponse()
150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s response.begin()
150s ~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s version, status, reason = self._read_status()
150s ~~~~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s > resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/metrics/v2/list', response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s def increment(
150s self,
150s method: str | None = None,
150s url: str | None = None,
150s response: BaseHTTPResponse | None = None,
150s error: Exception | None = None,
150s _pool: ConnectionPool | None = None,
150s _stacktrace: TracebackType | None = None,
150s ) -> Self:
150s """Return a new Retry object with incremented retry counters.
150s
150s :param response: A response object, or None, if the server did not
150s return a response.
150s :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s :param Exception error: An error encountered during the request, or
150s None if the response was received successfully.
150s
150s :return: A new ``Retry`` object.
150s """
150s if self.total is False and error:
150s # Disabled, indicate to re-raise the error.
150s raise reraise(type(error), error, _stacktrace)
150s
150s total = self.total
150s if total is not None:
150s total -= 1
150s
150s connect = self.connect
150s read = self.read
150s redirect = self.redirect
150s status_count = self.status
150s other = self.other
150s cause = "unknown"
150s status = None
150s redirect_location = None
150s
150s if error and self._is_connection_error(error):
150s # Connect retry?
150s if connect is False:
150s raise reraise(type(error), error, _stacktrace)
150s elif connect is not None:
150s connect -= 1
150s
150s elif error and self._is_read_error(error):
150s # Read retry?
150s if read is False or method is None or not self._is_method_retryable(method):
150s raise reraise(type(error), error, _stacktrace)
150s elif read is not None:
150s read -= 1
150s
150s elif error:
150s # Other retry?
150s if other is not None:
150s other -= 1
150s
150s elif response and response.get_redirect_location():
150s # Redirect retry?
150s if redirect is not None:
150s redirect -= 1
150s cause = "too many redirects"
150s response_redirect_location = response.get_redirect_location()
150s if response_redirect_location:
150s redirect_location = response_redirect_location
150s status = response.status
150s
150s else:
150s # Incrementing because of a server error like a 500 in
150s # status_forcelist and the given method is in the allowed_methods
150s cause = ResponseError.GENERIC_ERROR
150s if response and response.status:
150s if status_count is not None:
150s status_count -= 1
150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s status = response.status
150s
150s history = self.history + (
150s RequestHistory(method, url, error, status, redirect_location),
150s )
150s
150s new_retry = self.new(
150s total=total,
150s connect=connect,
150s read=read,
150s redirect=redirect,
150s status=status_count,
150s other=other,
150s history=history,
150s )
150s
150s if new_retry.is_exhausted():
150s reason = error or ResponseError(cause)
150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/list (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s
150s def test_metric_v2_list(self, api):
150s # test metric() (no arguments)
150s metrics_body = {
150s "request": {"type": "list"},
150s "value": {
150s "java.util.logging": {"type=Logging": {}},
150s },
150s "timestamp": 0,
150s "status": 200,
150s }
150s
150s httpretty.enable()
150s httpretty.register_uri(
150s httpretty.GET,
150s "http://localhost:8080/metrics/v2/list",
150s body=json.dumps(metrics_body),
150s )
150s > metric = api.metric()
150s
150s tests/test_api_metrics.py:164:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/metrics.py:29: in metric
150s res = self._query("metrics-list")
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s except (ProtocolError, OSError) as err:
150s raise ConnectionError(err, request=request)
150s
150s except MaxRetryError as e:
150s if isinstance(e.reason, ConnectTimeoutError):
150s # TODO: Remove this in 3.0.0: see #2811
150s if not isinstance(e.reason, NewConnectionError):
150s raise ConnectTimeout(e, request=request)
150s
150s if isinstance(e.reason, ResponseError):
150s raise RetryError(e, request=request)
150s
150s if isinstance(e.reason, _ProxyError):
150s > raise ProxyError(e, request=request)
150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/metrics/v2/list (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(218 bytes) to ://localhost:8080http://localhost:8080/metrics/v2/list via GET at 1750344358.5461922
150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s _________________________ TestCommandAPI.test_command __________________________
150s
150s self =
150s method = 'POST'
150s url = 'http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a'
150s body = '{"certname": "testnode"}'
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'Content-Length': '24'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/cmd/v1', query='command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s def urlopen( # type: ignore[override]
150s self,
150s method: str,
150s url: str,
150s body: _TYPE_BODY | None = None,
150s headers: typing.Mapping[str, str] | None = None,
150s retries: Retry | bool | int | None = None,
150s redirect: bool = True,
150s assert_same_host: bool = True,
150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s pool_timeout: int | None = None,
150s release_conn: bool | None = None,
150s chunked: bool = False,
150s body_pos: _TYPE_BODY_POSITION | None = None,
150s preload_content: bool = True,
150s decode_content: bool = True,
150s **response_kw: typing.Any,
150s ) -> BaseHTTPResponse:
150s """
150s Get a connection from the pool and perform an HTTP request. This is the
150s lowest level call for making a request, so you'll need to specify all
150s the raw details.
150s
150s .. note::
150s
150s More commonly, it's appropriate to use a convenience method
150s such as :meth:`request`.
150s
150s .. note::
150s
150s `release_conn` will only behave as expected if
150s `preload_content=False` because we want to make
150s `preload_content=False` the default behaviour someday soon without
150s breaking backwards compatibility.
150s
150s :param method:
150s HTTP request method (such as GET, POST, PUT, etc.)
150s
150s :param url:
150s The URL to perform the request on.
150s
150s :param body:
150s Data to send in the request body, either :class:`str`, :class:`bytes`,
150s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s :param headers:
150s Dictionary of custom headers to send, such as User-Agent,
150s If-None-Match, etc. If None, pool headers are used. If provided,
150s these headers completely replace any pool-specific headers.
150s
150s :param retries:
150s Configure the number of retries to allow before raising a
150s :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s over different types of retries.
150s Pass an integer number to retry connection errors that many times,
150s but no other types of errors. Pass zero to never retry.
150s
150s If ``False``, then retries are disabled and any exception is raised
150s immediately. Also, instead of raising a MaxRetryError on redirects,
150s the redirect response will be returned.
150s
150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s :param redirect:
150s If True, automatically handle redirects (status codes 301, 302,
150s 303, 307, 308). Each redirect counts as a retry. Disabling retries
150s will disable redirect, too.
150s
150s :param assert_same_host:
150s If ``True``, will make sure that the host of the pool requests is
150s consistent else will raise HostChangedError. When ``False``, you can
150s use the pool on an HTTP proxy and request foreign hosts.
150s
150s :param timeout:
150s If specified, overrides the default timeout for this one
150s request. It may be a float (in seconds) or an instance of
150s :class:`urllib3.util.Timeout`.
150s
150s :param pool_timeout:
150s If set and the pool is set to block=True, then this method will
150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s connection is available within the time period.
150s
150s :param bool preload_content:
150s If True, the response's body will be preloaded into memory.
150s
150s :param bool decode_content:
150s If True, will attempt to decode the body based on the
150s 'content-encoding' header.
150s
150s :param release_conn:
150s If False, then the urlopen call will not release the connection
150s back into the pool once a response is received (but will release if
150s you read the entire contents of the response such as when
150s `preload_content=True`). This is useful if you're not preloading
150s the response's content immediately. You will need to call
150s ``r.release_conn()`` on the response ``r`` to return the connection
150s back into the pool. If None, it takes the value of ``preload_content``
150s which defaults to ``True``.
150s
150s :param bool chunked:
150s If True, urllib3 will send the body using chunked transfer
150s encoding. Otherwise, urllib3 will send the body using the standard
150s content-length form. Defaults to False.
150s
150s :param int body_pos:
150s Position to seek to in file-like body in the event of a retry or
150s redirect. Typically this won't need to be set because urllib3 will
150s auto-populate the value when needed.
150s """
150s parsed_url = parse_url(url)
150s destination_scheme = parsed_url.scheme
150s
150s if headers is None:
150s headers = self.headers
150s
150s if not isinstance(retries, Retry):
150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s if release_conn is None:
150s release_conn = preload_content
150s
150s # Check host
150s if assert_same_host and not self.is_same_host(url):
150s raise HostChangedError(self, url, retries)
150s
150s # Ensure that the URL we're connecting to is properly encoded
150s if url.startswith("/"):
150s url = to_str(_encode_target(url))
150s else:
150s url = to_str(parsed_url.url)
150s
150s conn = None
150s
150s # Track whether `conn` needs to be released before
150s # returning/raising/recursing. Update this variable if necessary, and
150s # leave `release_conn` constant throughout the function. That way, if
150s # the function recurses, the original value of `release_conn` will be
150s # passed down into the recursive call, and its value will be respected.
150s #
150s # See issue #651 [1] for details.
150s #
150s # [1]
150s release_this_conn = release_conn
150s
150s http_tunnel_required = connection_requires_http_tunnel(
150s self.proxy, self.proxy_config, destination_scheme
150s )
150s
150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s # have to copy the headers dict so we can safely change it without those
150s # changes being reflected in anyone else's copy.
150s if not http_tunnel_required:
150s headers = headers.copy() # type: ignore[attr-defined]
150s headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s # Must keep the exception bound to a separate variable or else Python 3
150s # complains about UnboundLocalError.
150s err = None
150s
150s # Keep track of whether we cleanly exited the except block. This
150s # ensures we do proper cleanup in finally.
150s clean_exit = False
150s
150s # Rewind body position, if needed. Record current position
150s # for future rewinds in the event of a redirect/retry.
150s body_pos = set_file_position(body, body_pos)
150s
150s try:
150s # Request a connection from the queue.
150s timeout_obj = self._get_timeout(timeout)
150s conn = self._get_conn(timeout=pool_timeout)
150s
150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s # Is this a closed/new connection that requires CONNECT tunnelling?
150s if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s try:
150s self._prepare_proxy(conn)
150s except (BaseSSLError, OSError, SocketTimeout) as e:
150s self._raise_timeout(
150s err=e, url=self.proxy.url, timeout_value=conn.timeout
150s )
150s raise
150s
150s # If we're going to release the connection in ``finally:``, then
150s # the response doesn't need to know about the connection. Otherwise
150s # it will also try to release it and we'll have a double-release
150s # mess.
150s response_conn = conn if not release_conn else None
150s
150s # Make the request on the HTTPConnection object
150s > response = self._make_request(
150s conn,
150s method,
150s url,
150s timeout=timeout_obj,
150s body=body,
150s headers=headers,
150s chunked=chunked,
150s retries=retries,
150s response_conn=response_conn,
150s preload_content=preload_content,
150s decode_content=decode_content,
150s **response_kw,
150s )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s def _read_status(self):
150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s if len(line) > _MAXLINE:
150s raise LineTooLong("status line")
150s if self.debuglevel > 0:
150s print("reply:", repr(line))
150s if not line:
150s # Presumably, the server closed the connection before
150s # sending a valid response.
150s > raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s E http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s response = self._make_request(
150s conn,
150s ...<10 lines>...
150s **response_kw,
150s )
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s response = conn.getresponse()
150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s httplib_response = super().getresponse()
150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s response.begin()
150s ~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s version, status, reason = self._read_status()
150s ~~~~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest ` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) ` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s > resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'POST'
150s url = 'http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s def increment(
150s self,
150s method: str | None = None,
150s url: str | None = None,
150s response: BaseHTTPResponse | None = None,
150s error: Exception | None = None,
150s _pool: ConnectionPool | None = None,
150s _stacktrace: TracebackType | None = None,
150s ) -> Self:
150s """Return a new Retry object with incremented retry counters.
150s
150s :param response: A response object, or None, if the server did not
150s return a response.
150s :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s :param Exception error: An error encountered during the request, or
150s None if the response was received successfully.
150s
150s :return: A new ``Retry`` object.
150s """
150s if self.total is False and error:
150s # Disabled, indicate to re-raise the error.
150s raise reraise(type(error), error, _stacktrace)
150s
150s total = self.total
150s if total is not None:
150s total -= 1
150s
150s connect = self.connect
150s read = self.read
150s redirect = self.redirect
150s status_count = self.status
150s other = self.other
150s cause = "unknown"
150s status = None
150s redirect_location = None
150s
150s if error and self._is_connection_error(error):
150s # Connect retry?
150s if connect is False:
150s raise reraise(type(error), error, _stacktrace)
150s elif connect is not None:
150s connect -= 1
150s
150s elif error and self._is_read_error(error):
150s # Read retry?
150s if read is False or method is None or not self._is_method_retryable(method):
150s raise reraise(type(error), error, _stacktrace)
150s elif read is not None:
150s read -= 1
150s
150s elif error:
150s # Other retry?
150s if other is not None:
150s other -= 1
150s
150s elif response and response.get_redirect_location():
150s # Redirect retry?
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_command(self, api): 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/cmd/v1", method=httpretty.POST) 150s > api.command("deactivate node", {"certname": "testnode"}) 150s 150s tests/test_api_other.py:32: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/command.py:19: in command 150s return self._cmd(command, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/command.py:58: in _cmd 150s r = self.session.post( 150s /usr/lib/python3/dist-packages/requests/sessions.py:637: in post 150s return self.request("POST", url, data=data, json=json, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s except (ProtocolError, OSError) as err:
150s raise ConnectionError(err, request=request)
150s
150s except MaxRetryError as e:
150s if isinstance(e.reason, ConnectTimeoutError):
150s # TODO: Remove this in 3.0.0: see #2811
150s if not isinstance(e.reason, NewConnectionError):
150s raise ConnectTimeout(e, request=request)
150s
150s if isinstance(e.reason, ResponseError):
150s raise RetryError(e, request=request)
150s
150s if isinstance(e.reason, _ProxyError):
150s > raise ProxyError(e, request=request)
150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(336 bytes) to ://localhost:8080http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a via POST at 1750344358.661648
150s ERROR pypuppetdb.api.command:command.py:87 Could not reach PuppetDB on localhost:8080 over HTTP.
150s _______________________ TestCommandAPI.test_cmd[string] ________________________
150s
150s self =
150s method = 'POST'
150s url = 'http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a'
150s body = '{"certname": "testnode"}'
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'Content-Length': '24'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/cmd/v1', query='command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s def urlopen( # type: ignore[override]
150s self,
150s method: str,
150s url: str,
150s body: _TYPE_BODY | None = None,
150s headers: typing.Mapping[str, str] | None = None,
150s retries: Retry | bool | int | None = None,
150s redirect: bool = True,
150s assert_same_host: bool = True,
150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s pool_timeout: int | None = None,
150s release_conn: bool | None = None,
150s chunked: bool = False,
150s body_pos: _TYPE_BODY_POSITION | None = None,
150s preload_content: bool = True,
150s decode_content: bool = True,
150s **response_kw: typing.Any,
150s ) -> BaseHTTPResponse:
150s """
150s Get a connection from the pool and perform an HTTP request. This is the
150s lowest level call for making a request, so you'll need to specify all
150s the raw details.
150s
150s .. note::
150s
150s More commonly, it's appropriate to use a convenience method
150s such as :meth:`request`.
150s
150s .. note::
150s
150s `release_conn` will only behave as expected if
150s `preload_content=False` because we want to make
150s `preload_content=False` the default behaviour someday soon without
150s breaking backwards compatibility.
150s
150s :param method:
150s HTTP request method (such as GET, POST, PUT, etc.)
150s
150s :param url:
150s The URL to perform the request on.
150s
150s :param body:
150s Data to send in the request body, either :class:`str`, :class:`bytes`,
150s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s :param headers:
150s Dictionary of custom headers to send, such as User-Agent,
150s If-None-Match, etc. If None, pool headers are used. If provided,
150s these headers completely replace any pool-specific headers.
150s
150s :param retries:
150s Configure the number of retries to allow before raising a
150s :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s over different types of retries.
150s Pass an integer number to retry connection errors that many times,
150s but no other types of errors. Pass zero to never retry.
150s
150s If ``False``, then retries are disabled and any exception is raised
150s immediately. Also, instead of raising a MaxRetryError on redirects,
150s the redirect response will be returned.
150s
150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s :param redirect:
150s If True, automatically handle redirects (status codes 301, 302,
150s 303, 307, 308). Each redirect counts as a retry. Disabling retries
150s will disable redirect, too.
150s
150s :param assert_same_host:
150s If ``True``, will make sure that the host of the pool requests is
150s consistent else will raise HostChangedError. When ``False``, you can
150s use the pool on an HTTP proxy and request foreign hosts.
150s
150s :param timeout:
150s If specified, overrides the default timeout for this one
150s request. It may be a float (in seconds) or an instance of
150s :class:`urllib3.util.Timeout`.
150s
150s :param pool_timeout:
150s If set and the pool is set to block=True, then this method will
150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s connection is available within the time period.
150s
150s :param bool preload_content:
150s If True, the response's body will be preloaded into memory.
150s
150s :param bool decode_content:
150s If True, will attempt to decode the body based on the
150s 'content-encoding' header.
150s
150s :param release_conn:
150s If False, then the urlopen call will not release the connection
150s back into the pool once a response is received (but will release if
150s you read the entire contents of the response such as when
150s `preload_content=True`). This is useful if you're not preloading
150s the response's content immediately. You will need to call
150s ``r.release_conn()`` on the response ``r`` to return the connection
150s back into the pool. If None, it takes the value of ``preload_content``
150s which defaults to ``True``.
150s
150s :param bool chunked:
150s If True, urllib3 will send the body using chunked transfer
150s encoding. Otherwise, urllib3 will send the body using the standard
150s content-length form. Defaults to False.
150s
150s :param int body_pos:
150s Position to seek to in file-like body in the event of a retry or
150s redirect. Typically this won't need to be set because urllib3 will
150s auto-populate the value when needed.
150s """
150s parsed_url = parse_url(url)
150s destination_scheme = parsed_url.scheme
150s
150s if headers is None:
150s headers = self.headers
150s
150s if not isinstance(retries, Retry):
150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s if release_conn is None:
150s release_conn = preload_content
150s
150s # Check host
150s if assert_same_host and not self.is_same_host(url):
150s raise HostChangedError(self, url, retries)
150s
150s # Ensure that the URL we're connecting to is properly encoded
150s if url.startswith("/"):
150s url = to_str(_encode_target(url))
150s else:
150s url = to_str(parsed_url.url)
150s
150s conn = None
150s
150s # Track whether `conn` needs to be released before
150s # returning/raising/recursing. Update this variable if necessary, and
150s # leave `release_conn` constant throughout the function. That way, if
150s # the function recurses, the original value of `release_conn` will be
150s # passed down into the recursive call, and its value will be respected.
150s #
150s # See issue #651 [1] for details.
150s #
150s # [1] <https://github.com/urllib3/urllib3/issues/651>
150s release_this_conn = release_conn
150s
150s http_tunnel_required = connection_requires_http_tunnel(
150s self.proxy, self.proxy_config, destination_scheme
150s )
150s
150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We
150s # have to copy the headers dict so we can safely change it without those
150s # changes being reflected in anyone else's copy.
150s if not http_tunnel_required:
150s headers = headers.copy() # type: ignore[attr-defined]
150s headers.update(self.proxy_headers) # type: ignore[union-attr]
150s
150s # Must keep the exception bound to a separate variable or else Python 3
150s # complains about UnboundLocalError.
150s err = None
150s
150s # Keep track of whether we cleanly exited the except block. This
150s # ensures we do proper cleanup in finally.
150s clean_exit = False
150s
150s # Rewind body position, if needed. Record current position
150s # for future rewinds in the event of a redirect/retry.
150s body_pos = set_file_position(body, body_pos)
150s
150s try:
150s # Request a connection from the queue.
150s timeout_obj = self._get_timeout(timeout)
150s conn = self._get_conn(timeout=pool_timeout)
150s
150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
150s
150s # Is this a closed/new connection that requires CONNECT tunnelling?
150s if self.proxy is not None and http_tunnel_required and conn.is_closed:
150s try:
150s self._prepare_proxy(conn)
150s except (BaseSSLError, OSError, SocketTimeout) as e:
150s self._raise_timeout(
150s err=e, url=self.proxy.url, timeout_value=conn.timeout
150s )
150s raise
150s
150s # If we're going to release the connection in ``finally:``, then
150s # the response doesn't need to know about the connection. Otherwise
150s # it will also try to release it and we'll have a double-release
150s # mess.
150s response_conn = conn if not release_conn else None
150s
150s # Make the request on the HTTPConnection object
150s > response = self._make_request(
150s conn,
150s method,
150s url,
150s timeout=timeout_obj,
150s body=body,
150s headers=headers,
150s chunked=chunked,
150s retries=retries,
150s response_conn=response_conn,
150s preload_content=preload_content,
150s decode_content=decode_content,
150s **response_kw,
150s )
150s
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request
150s response = conn.getresponse()
150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse
150s httplib_response = super().getresponse()
150s /usr/lib/python3.13/http/client.py:1430: in getresponse
150s response.begin()
150s /usr/lib/python3.13/http/client.py:331: in begin
150s version, status, reason = self._read_status()
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s
150s def _read_status(self):
150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s if len(line) > _MAXLINE:
150s raise LineTooLong("status line")
150s if self.debuglevel > 0:
150s print("reply:", repr(line))
150s if not line:
150s # Presumably, the server closed the connection before
150s # sending a valid response.
150s > raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s E http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s response = self._make_request(
150s conn,
150s ...<10 lines>...
150s **response_kw,
150s )
150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s response = conn.getresponse()
150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s httplib_response = super().getresponse()
150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s response.begin()
150s ~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s version, status, reason = self._read_status()
150s ~~~~~~~~~~~~~~~~~^^
150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s raise RemoteDisconnected("Remote end closed connection without"
150s " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s
150s The above exception was the direct cause of the following exception:
150s
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s
150s The above exception was the direct cause of the following exception:
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) <timeouts>` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s > resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:667:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'POST'
150s url = 'http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool =
150s _stacktrace =
150s
150s def increment(
150s self,
150s method: str | None = None,
150s url: str | None = None,
150s response: BaseHTTPResponse | None = None,
150s error: Exception | None = None,
150s _pool: ConnectionPool | None = None,
150s _stacktrace: TracebackType | None = None,
150s ) -> Self:
150s """Return a new Retry object with incremented retry counters.
150s
150s :param response: A response object, or None, if the server did not
150s return a response.
150s :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s :param Exception error: An error encountered during the request, or
150s None if the response was received successfully.
150s
150s :return: A new ``Retry`` object.
150s """
150s if self.total is False and error:
150s # Disabled, indicate to re-raise the error.
150s raise reraise(type(error), error, _stacktrace)
150s
150s total = self.total
150s if total is not None:
150s total -= 1
150s
150s connect = self.connect
150s read = self.read
150s redirect = self.redirect
150s status_count = self.status
150s other = self.other
150s cause = "unknown"
150s status = None
150s redirect_location = None
150s
150s if error and self._is_connection_error(error):
150s # Connect retry?
150s if connect is False:
150s raise reraise(type(error), error, _stacktrace)
150s elif connect is not None:
150s connect -= 1
150s
150s elif error and self._is_read_error(error):
150s # Read retry?
150s if read is False or method is None or not self._is_method_retryable(method):
150s raise reraise(type(error), error, _stacktrace)
150s elif read is not None:
150s read -= 1
150s
150s elif error:
150s # Other retry?
150s if other is not None:
150s other -= 1
150s
150s elif response and response.get_redirect_location():
150s # Redirect retry?
150s if redirect is not None:
150s redirect -= 1
150s cause = "too many redirects"
150s response_redirect_location = response.get_redirect_location()
150s if response_redirect_location:
150s redirect_location = response_redirect_location
150s status = response.status
150s
150s else:
150s # Incrementing because of a server error like a 500 in
150s # status_forcelist and the given method is in the allowed_methods
150s cause = ResponseError.GENERIC_ERROR
150s if response and response.status:
150s if status_count is not None:
150s status_count -= 1
150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s status = response.status
150s
150s history = self.history + (
150s RequestHistory(method, url, error, status, redirect_location),
150s )
150s
150s new_retry = self.new(
150s total=total,
150s connect=connect,
150s read=read,
150s redirect=redirect,
150s status=status_count,
150s other=other,
150s history=history,
150s )
150s
150s if new_retry.is_exhausted():
150s reason = error or ResponseError(cause)
150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s
150s During handling of the above exception, another exception occurred:
150s
150s self =
150s api =
150s query = '["certname", "=", "node1"]'
150s
150s def test_cmd(self, api, query):
150s httpretty.reset()
150s httpretty.enable()
150s stub_request("http://localhost:8080/pdb/cmd/v1", method=httpretty.POST)
150s node_name = "testnode"
150s > api._cmd("deactivate node", {"certname": node_name})
150s
150s tests/test_api_other.py:42:
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s /usr/lib/python3/dist-packages/pypuppetdb/api/command.py:58: in _cmd
150s r = self.session.post(
150s /usr/lib/python3/dist-packages/requests/sessions.py:637: in post
150s return self.request("POST", url, data=data, json=json, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
150s
150s self =
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s
150s def send(
150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s ):
150s """Sends PreparedRequest object. Returns Response object.
150s
150s :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
150s :param stream: (optional) Whether to stream the request content.
150s :param timeout: (optional) How long to wait for the server to send
150s data before giving up, as a float, or a :ref:`(connect timeout,
150s read timeout) <timeouts>` tuple.
150s :type timeout: float or tuple or urllib3 Timeout object
150s :param verify: (optional) Either a boolean, in which case it controls whether
150s we verify the server's TLS certificate, or a string, in which case it
150s must be a path to a CA bundle to use
150s :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s :param proxies: (optional) The proxies dictionary to apply to the request.
150s :rtype: requests.Response
150s """
150s
150s try:
150s conn = self.get_connection_with_tls_context(
150s request, verify, proxies=proxies, cert=cert
150s )
150s except LocationValueError as e:
150s raise InvalidURL(e, request=request)
150s
150s self.cert_verify(conn, request.url, verify, cert)
150s url = self.request_url(request, proxies)
150s self.add_headers(
150s request,
150s stream=stream,
150s timeout=timeout,
150s verify=verify,
150s cert=cert,
150s proxies=proxies,
150s )
150s
150s chunked = not (request.body is None or "Content-Length" in request.headers)
150s
150s if isinstance(timeout, tuple):
150s try:
150s connect, read = timeout
150s timeout = TimeoutSauce(connect=connect, read=read)
150s except ValueError:
150s raise ValueError(
150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s f"or a single float to set both timeouts to the same value."
150s )
150s elif isinstance(timeout, TimeoutSauce):
150s pass
150s else:
150s timeout = TimeoutSauce(connect=timeout, read=timeout)
150s
150s try:
150s resp = conn.urlopen(
150s method=request.method,
150s url=url,
150s body=request.body,
150s headers=request.headers,
150s redirect=False,
150s assert_same_host=False,
150s preload_content=False,
150s decode_content=False,
150s retries=self.max_retries,
150s timeout=timeout,
150s chunked=chunked,
150s )
150s
150s except (ProtocolError, OSError) as err:
150s raise ConnectionError(err, request=request)
150s
150s except MaxRetryError as e:
150s if isinstance(e.reason, ConnectTimeoutError):
150s # TODO: Remove this in 3.0.0: see #2811
150s if not isinstance(e.reason, NewConnectionError):
150s raise ConnectTimeout(e, request=request)
150s
150s if isinstance(e.reason, ResponseError):
150s raise RetryError(e, request=request)
150s
150s if isinstance(e.reason, _ProxyError):
150s > raise ProxyError(e, request=request)
150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(336 bytes) to ://localhost:8080http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a via POST at 1750344358.7702959
150s ERROR pypuppetdb.api.command:command.py:87 Could not reach PuppetDB on localhost:8080 over HTTP.
150s ____________________ TestCommandAPI.test_cmd[QueryBuilder] _____________________
150s
150s self =
150s method = 'POST'
150s url = 'http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a'
150s body = '{"certname": "testnode"}'
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'Content-Length': '24'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/cmd/v1', query='command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a', fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s
150s def urlopen( # type: ignore[override]
150s self,
150s method: str,
150s url: str,
150s body: _TYPE_BODY | None = None,
150s headers: typing.Mapping[str, str] | None = None,
150s retries: Retry | bool | int | None = None,
150s redirect: bool = True,
150s assert_same_host: bool = True,
150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s pool_timeout: int | None = None,
150s release_conn: bool | None = None,
150s chunked: bool = False,
150s body_pos: _TYPE_BODY_POSITION | None = None,
150s preload_content: bool = True,
150s decode_content: bool = True,
150s **response_kw: typing.Any,
150s ) -> BaseHTTPResponse:
150s """
150s Get a connection from the pool and perform an HTTP request. This is the
150s lowest level call for making a request, so you'll need to specify all
150s the raw details.
150s
150s .. note::
150s
150s More commonly, it's appropriate to use a convenience method
150s such as :meth:`request`.
150s
150s .. note::
150s
150s `release_conn` will only behave as expected if
150s `preload_content=False` because we want to make
150s `preload_content=False` the default behaviour someday soon without
150s breaking backwards compatibility.
150s
150s :param method:
150s HTTP request method (such as GET, POST, PUT, etc.)
150s
150s :param url:
150s The URL to perform the request on.
150s
150s :param body:
150s Data to send in the request body, either :class:`str`, :class:`bytes`,
150s an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s
150s :param headers:
150s Dictionary of custom headers to send, such as User-Agent,
150s If-None-Match, etc. If None, pool headers are used. If provided,
150s these headers completely replace any pool-specific headers.
150s
150s :param retries:
150s Configure the number of retries to allow before raising a
150s :class:`~urllib3.exceptions.MaxRetryError` exception.
150s
150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s over different types of retries.
150s Pass an integer number to retry connection errors that many times,
150s but no other types of errors. Pass zero to never retry.
150s
150s If ``False``, then retries are disabled and any exception is raised
150s immediately. Also, instead of raising a MaxRetryError on redirects,
150s the redirect response will be returned.
150s
150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s
150s :param redirect:
150s If True, automatically handle redirects (status codes 301, 302,
150s 303, 307, 308). Each redirect counts as a retry. Disabling retries
150s will disable redirect, too.
150s
150s :param assert_same_host:
150s If ``True``, will make sure that the host of the pool requests is
150s consistent else will raise HostChangedError. When ``False``, you can
150s use the pool on an HTTP proxy and request foreign hosts.
150s
150s :param timeout:
150s If specified, overrides the default timeout for this one
150s request. It may be a float (in seconds) or an instance of
150s :class:`urllib3.util.Timeout`.
150s
150s :param pool_timeout:
150s If set and the pool is set to block=True, then this method will
150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s connection is available within the time period.
150s
150s :param bool preload_content:
150s If True, the response's body will be preloaded into memory.
150s
150s :param bool decode_content:
150s If True, will attempt to decode the body based on the
150s 'content-encoding' header.
150s
150s :param release_conn:
150s If False, then the urlopen call will not release the connection
150s back into the pool once a response is received (but will release if
150s you read the entire contents of the response such as when
150s `preload_content=True`). This is useful if you're not preloading
150s the response's content immediately. You will need to call
150s ``r.release_conn()`` on the response ``r`` to return the connection
150s back into the pool. If None, it takes the value of ``preload_content``
150s which defaults to ``True``.
150s
150s :param bool chunked:
150s If True, urllib3 will send the body using chunked transfer
150s encoding. Otherwise, urllib3 will send the body using the standard
150s content-length form. Defaults to False.
150s
150s :param int body_pos:
150s Position to seek to in file-like body in the event of a retry or
150s redirect. Typically this won't need to be set because urllib3 will
150s auto-populate the value when needed.
150s """
150s parsed_url = parse_url(url)
150s destination_scheme = parsed_url.scheme
150s
150s if headers is None:
150s headers = self.headers
150s
150s if not isinstance(retries, Retry):
150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
150s
150s if release_conn is None:
150s release_conn = preload_content
150s
150s # Check host
150s if assert_same_host and not self.is_same_host(url):
150s raise HostChangedError(self, url, retries)
150s
150s # Ensure that the URL we're connecting to is properly encoded
150s if url.startswith("/"):
150s url = to_str(_encode_target(url))
150s else:
150s url = to_str(parsed_url.url)
150s
150s conn = None
150s
150s # Track whether `conn` needs to be released before
150s # returning/raising/recursing. Update this variable if necessary, and
150s # leave `release_conn` constant throughout the function. That way, if
150s # the function recurses, the original value of `release_conn` will be
150s # passed down into the recursive call, and its value will be respected.
150s #
150s # See issue #651 [1] for details.
150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 
150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 
150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'POST' 150s url = 'http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s query = Query: ["=", "certname", "node1"] 150s 150s def test_cmd(self, api, query): 150s httpretty.reset() 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/cmd/v1", method=httpretty.POST) 150s node_name = "testnode" 150s > api._cmd("deactivate node", {"certname": node_name}) 
150s 150s tests/test_api_other.py:42: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/command.py:58: in _cmd 150s r = self.session.post( 150s /usr/lib/python3/dist-packages/requests/sessions.py:637: in post 150s return self.request("POST", url, data=data, json=json, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(336 bytes) to ://localhost:8080http://localhost:8080/pdb/cmd/v1?command=deactivate+node&version=3&certname=testnode&checksum=b93d474970e54943aec050ee399dfb85d21e143a via POST at 1750344358.8788662 150s ERROR pypuppetdb.api.command:command.py:87 Could not reach PuppetDB on localhost:8080 over HTTP. 150s _______________ TestCommandAPI.test_cmd_with_token_authorization _______________ 150s 150s self = 150s method = 'POST' 150s url = '/pdb/cmd/v1?command=deactivate+node&version=3&certname=&checksum=1d150468bd137c6511986985d707dc451d093a9c' 150s body = '{"certname": ""}' 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8', 'X-Authentication': 'tokenstring', 'Content-Length': '16'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/pdb/cmd/v1', query='command=deactivate+node&version=3&certname=&checksum=1d150468bd137c6511986985d707dc451d093a9c', fragment=None) 150s destination_scheme = None, conn = None, release_this_conn = True 150s 
http_tunnel_required = True, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. 
Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. 
You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. 
We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 
150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s > self._prepare_proxy(conn) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:773: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:1042: in _prepare_proxy 150s conn.connect() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:721: in connect 150s self._tunnel() 150s /usr/lib/python3.13/http/client.py:971: in _tunnel 150s (version, code, message) = response._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 
150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:474: in increment 150s raise reraise(type(error), error, _stacktrace) 150s /usr/lib/python3/dist-packages/urllib3/util/util.py:38: in reraise 150s raise value.with_traceback(tb) 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:773: in urlopen 150s self._prepare_proxy(conn) 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:1042: in _prepare_proxy 150s conn.connect() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:721: in connect 150s self._tunnel() 150s /usr/lib/python3.13/http/client.py:971: in _tunnel 150s (version, code, message) = response._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s /usr/lib/python3.13/http/client.py:300: ProtocolError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s token_api = 
150s 
150s     def test_cmd_with_token_authorization(self, token_api):
150s         httpretty.enable()
150s         stub_request("https://localhost:8080/pdb/cmd/v1", method=httpretty.POST)
150s >       token_api._cmd("deactivate node", {"certname": ""})
150s 
150s tests/test_api_other.py:69: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/command.py:58: in _cmd
150s     r = self.session.post(
150s /usr/lib/python3/dist-packages/requests/sessions.py:637: in post
150s     return self.request("POST", url, data=data, json=json, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s >           raise ConnectionError(err, request=request)
150s E           requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:682: ConnectionError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(57 bytes) to ://localhost:8080localhost:8080 via CONNECT at 1750344358.9852047
150s ERROR    pypuppetdb.api.command:command.py:87 Could not reach PuppetDB on localhost:8080 over HTTPS.
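The captured httpretty warnings above show real socket traffic to egress.ps7.internal:3128: the testbed exports `http_proxy`/`https_proxy`, requests honours those variables by default, and the request therefore bypasses httpretty's socket patch and dies at the real proxy. A minimal stdlib sketch of the mechanism, assuming a Linux-style environment (the proxy URL is taken from the log; `Session.trust_env` is requests' documented opt-out, mentioned only in the comment):

```python
import os
import urllib.request

# Simulate the testbed: a proxy is configured via the environment.
os.environ["http_proxy"] = "http://egress.ps7.internal:3128/"

# This is the proxy mapping that urllib and requests pick up by default.
print(urllib.request.getproxies().get("http"))

# Clearing the variables (or setting trust_env = False on a requests.Session)
# keeps localhost traffic off the proxy, so socket-level mocks such as
# httpretty can intercept it instead of the real proxy seeing the bytes.
for var in ("http_proxy", "HTTP_PROXY"):
    os.environ.pop(var, None)
print(urllib.request.getproxies().get("http"))
```

This matches the failure mode in the log: the stubbed URL is on localhost, but with the proxy variables set the connection is made to the proxy host first, which closes it without a response.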
150s __________________________ TestStatusAPI.test_status ___________________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/status/v1/services/puppetdb-status' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/status/v1/services/puppetdb-status', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s 
150s     def _read_status(self):
150s         line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
150s         if len(line) > _MAXLINE:
150s             raise LineTooLong("status line")
150s         if self.debuglevel > 0:
150s             print("reply:", repr(line))
150s         if not line:
150s             # Presumably, the server closed the connection before
150s             # sending a valid response.
150s >           raise RemoteDisconnected("Remote end closed connection without"
150s                                      " response")
150s E           http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected
150s 
150s The above exception was the direct cause of the following exception:
150s Traceback (most recent call last):
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
150s     response = self._make_request(
150s         conn,
150s     ...<10 lines>...
150s         **response_kw,
150s     )
150s   File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request
150s     response = conn.getresponse()
150s   File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse
150s     httplib_response = super().getresponse()
150s   File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse
150s     response.begin()
150s     ~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 331, in begin
150s     version, status, reason = self._read_status()
150s                               ~~~~~~~~~~~~~~~~~^^
150s   File "/usr/lib/python3.13/http/client.py", line 300, in _read_status
150s     raise RemoteDisconnected("Remote end closed connection without"
150s                              " response")
150s http.client.RemoteDisconnected: Remote end closed connection without response
150s 
150s The above exception was the direct cause of the following exception:
150s 
150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s 
150s The above exception was the direct
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/status/v1/services/puppetdb-status' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s 
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s 
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s 
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s 
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s 
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s 
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s             status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/status/v1/services/puppetdb-status (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_status(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/status/v1/services/puppetdb-status")
150s >       api.status()
150s 
150s tests/test_api_other.py:78: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/status.py:19: in status
150s     return
self._query("status") 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/status/v1/services/puppetdb-status (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(237 bytes) to ://localhost:8080http://localhost:8080/status/v1/services/puppetdb-status via GET at 1750344359.11571 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 150s _________________________ TestPqlAPI.test_pql_casting __________________________ 150s 150s self = 150s method = 'GET' 150s url = 'http://localhost:8080/pdb/query/v4?query=nodes+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4', query='query=nodes+%7B%0A++++++++++++...%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D', fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: 
ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. 
Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. 
You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. 
We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 
150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 
150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET' 150s url = 'http://localhost:8080/pdb/query/v4?query=nodes+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4?query=nodes+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_pql_casting(self, api): 150s pql_query = """ 150s nodes { 150s facts { 150s name = "operatingsystem" and 150s value = "Debian" 150s } 150s } 150s """ 150s pql_body = [ 150s { 150s "cached_catalog_status": "not_used", 
150s "catalog_environment": "production", 150s "catalog_timestamp": "2016-08-15T11:06:26.275Z", 150s "certname": "greenserver.vm", 150s "deactivated": None, 150s "expired": None, 150s "facts_environment": "production", 150s "facts_timestamp": "2016-08-15T11:06:26.140Z", 150s "latest_report_hash": "4a956674b016d95a7b77c99513ba26e4a744f8d1", 150s "latest_report_noop": False, 150s "latest_report_noop_pending": None, 150s "latest_report_status": "changed", 150s "report_environment": "production", 150s "report_timestamp": "2016-08-15T11:06:18.393Z", 150s }, 150s { 150s "cached_catalog_status": "not_used", 150s "catalog_environment": "production", 150s "catalog_timestamp": "2016-08-15T11:06:26.275Z", 150s "certname": "blueserver.vm", 150s "deactivated": None, 150s "expired": None, 150s "facts_environment": "production", 150s "facts_timestamp": "2016-08-15T11:06:26.140Z", 150s "latest_report_hash": "4a956674b016d95a7b77c99513ba26e4a744f8d1", 150s "latest_report_noop": False, 150s "latest_report_noop_pending": None, 150s "latest_report_status": "changed", 150s "report_environment": "production", 150s "report_timestamp": "2016-08-15T11:06:18.393Z", 150s }, 150s ] 150s pql_url = "http://localhost:8080/pdb/query/v4" 150s 150s httpretty.enable() 150s httpretty.register_uri(httpretty.GET, pql_url, body=json.dumps(pql_body)) 150s 150s > nodes = list(api.pql(pql_query)) 150s 150s tests/test_api_pql.py:57: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/pql.py:87: in pql 150s for element in self._pql(pql=pql): 150s /usr/lib/python3/dist-packages/pypuppetdb/api/pql.py:47: in _pql 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s 
/usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4?query=nodes+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(379 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4?query=nodes+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D via GET at 1750344359.224996 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
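The "real call to socket.connect() for ('egress.ps7.internal', 3128)" warning above shows the root cause: the testbed exports `http_proxy`/`https_proxy`, so requests routes the call through the egress proxy instead of hitting the httpretty-mocked `localhost:8080` endpoint. A minimal, hedged sketch of how a test suite could scrub proxy variables from the environment before making requests (the helper name `scrub_proxy_env` is illustrative, not part of pypuppetdb):

```python
def scrub_proxy_env(environ):
    """Return a copy of the environment with proxy variables removed.

    CI hosts often export http_proxy/https_proxy; any HTTP client that
    honours them (requests does by default) will try to reach the proxy
    instead of a mocked localhost endpoint, defeating tools like httpretty.
    """
    proxy_keys = {"http_proxy", "https_proxy", "all_proxy", "no_proxy"}
    # Compare case-insensitively: both HTTP_PROXY and http_proxy are honoured.
    return {k: v for k, v in environ.items()
            if k.lower() not in proxy_keys}


clean = scrub_proxy_env({"HTTP_PROXY": "http://egress.example:3128/",
                         "PATH": "/usr/bin"})
# → {"PATH": "/usr/bin"}
```

With requests specifically, the equivalent fixes are `session.trust_env = False` on the Session under test, or pytest's `monkeypatch.delenv("http_proxy", raising=False)` (and the uppercase/https variants) in a fixture, so the adapter never consults the environment proxies at all.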
150s ________________________ TestPqlAPI.test_pql_no_casting ________________________ 150s 150s self = 150s method = 'GET' 150s url = 'http://localhost:8080/pdb/query/v4?query=nodes%5Bcertname%5D+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4', query='query=nodes%5Bcertname%5D+%7B%...%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D', fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. 
This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 
150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET' 150s url = 'http://localhost:8080/pdb/query/v4?query=nodes%5Bcertname%5D+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 
150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4?query=nodes%5Bcertname%5D+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_pql_no_casting(self, api): 150s pql_query = """ 150s nodes[certname] { 150s facts { 150s name = "operatingsystem" and 150s value = "Debian" 150s } 150s } 150s """ 150s pql_body = [ 150s {"certname": 
"foo.example.com"}, 150s {"certname": "bar.example.com"}, 150s ] 150s pql_url = "http://localhost:8080/pdb/query/v4" 150s 150s httpretty.enable() 150s httpretty.register_uri(httpretty.GET, pql_url, body=json.dumps(pql_body)) 150s 150s > elements = list(api.pql(pql_query)) 150s 150s tests/test_api_pql.py:83: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/pql.py:87: in pql 150s for element in self._pql(pql=pql): 150s /usr/lib/python3/dist-packages/pypuppetdb/api/pql.py:47: in _pql 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4?query=nodes%5Bcertname%5D+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(393 bytes) to 
://localhost:8080http://localhost:8080/pdb/query/v4?query=nodes%5Bcertname%5D+%7B%0A++++++++++++facts+%7B%0A++++++++++++++name+%3D+%22operatingsystem%22+and%0A++++++++++++++value+%3D+%22Debian%22%0A++++++++++++%7D%0A++++++++++%7D via GET at 1750344359.3378232 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 150s ___________________________ TestQueryAPI.test_facts ____________________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/facts', body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/facts', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. 
This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 
150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/facts' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/facts (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_facts(self, api): 150s facts_body = [ 150s { 150s "certname": "test_certname", 150s "name": "test_name", 150s "value": "test_value", 150s "environment": "test_environment", 150s } 150s ] 150s facts_url = "http://localhost:8080/pdb/query/v4/facts" 150s 150s httpretty.enable() 150s httpretty.register_uri(httpretty.GET, facts_url, body=json.dumps(facts_body)) 150s 150s > for 
fact in api.facts(): 150s 150s tests/test_api_query.py:43: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/query.py:140: in facts 150s facts = self._query("facts", path=path, **kwargs) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/facts (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(221 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/facts via GET at 1750344359.4510975 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
150s _________________________ TestQueryAPI.test_fact_names _________________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/fact-names' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/fact-names', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/fact-names' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/fact-names (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_fact_names(self, api): 150s httpretty.enable() 150s stub_request("http://localhost:8080/pdb/query/v4/fact-names") 150s > api.fact_names() 150s 150s tests/test_api_query.py:54: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/query.py:331: in fact_names 150s return 
self._query("fact-names") 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 
150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: 
http://localhost:8080/pdb/query/v4/fact-names (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(226 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/fact-names via GET at 1750344359.5643938 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 150s ________________________ TestQueryAPI.test_environments ________________________ 150s 150s self = 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/environments' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/environments', query=None, fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = 
_DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. 
Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. 
Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 
150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 
150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 
150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s >           resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen
150s     retries = retries.increment(
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/environments'
150s response = None
150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))
150s _pool = 
150s _stacktrace = 
150s 
150s     def increment(
150s         self,
150s         method: str | None = None,
150s         url: str | None = None,
150s         response: BaseHTTPResponse | None = None,
150s         error: Exception | None = None,
150s         _pool: ConnectionPool | None = None,
150s         _stacktrace: TracebackType | None = None,
150s     ) -> Self:
150s         """Return a new Retry object with incremented retry counters.
150s 
150s         :param response: A response object, or None, if the server did not
150s             return a response.
150s         :type response: :class:`~urllib3.response.BaseHTTPResponse`
150s         :param Exception error: An error encountered during the request, or
150s             None if the response was received successfully.
150s 
150s         :return: A new ``Retry`` object.
150s         """
150s         if self.total is False and error:
150s             # Disabled, indicate to re-raise the error.
150s             raise reraise(type(error), error, _stacktrace)
150s 
150s         total = self.total
150s         if total is not None:
150s             total -= 1
150s 
150s         connect = self.connect
150s         read = self.read
150s         redirect = self.redirect
150s         status_count = self.status
150s         other = self.other
150s         cause = "unknown"
150s         status = None
150s         redirect_location = None
150s 
150s         if error and self._is_connection_error(error):
150s             # Connect retry?
150s             if connect is False:
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif connect is not None:
150s                 connect -= 1
150s 
150s         elif error and self._is_read_error(error):
150s             # Read retry?
150s             if read is False or method is None or not self._is_method_retryable(method):
150s                 raise reraise(type(error), error, _stacktrace)
150s             elif read is not None:
150s                 read -= 1
150s 
150s         elif error:
150s             # Other retry?
150s             if other is not None:
150s                 other -= 1
150s 
150s         elif response and response.get_redirect_location():
150s             # Redirect retry?
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/environments (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_environments(self, api):
150s         httpretty.enable()
150s         stub_request("http://localhost:8080/pdb/query/v4/environments")
150s >       api.environments()
150s 
150s tests/test_api_query.py:66: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/query.py:116: in environments
150s     return self._query("environments", **kwargs)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s                 )
150s         elif isinstance(timeout, TimeoutSauce):
150s             pass
150s         else:
150s             timeout = TimeoutSauce(connect=timeout, read=timeout)
150s 
150s         try:
150s             resp = conn.urlopen(
150s                 method=request.method,
150s                 url=url,
150s                 body=request.body,
150s                 headers=request.headers,
150s                 redirect=False,
150s                 assert_same_host=False,
150s                 preload_content=False,
150s                 decode_content=False,
150s                 retries=self.max_retries,
150s                 timeout=timeout,
150s                 chunked=chunked,
150s             )
150s 
150s         except (ProtocolError, OSError) as err:
150s             raise ConnectionError(err, request=request)
150s 
150s         except MaxRetryError as e:
150s             if isinstance(e.reason, ConnectTimeoutError):
150s                 # TODO: Remove this in 3.0.0: see #2811
150s                 if not isinstance(e.reason, NewConnectionError):
150s                     raise ConnectTimeout(e, request=request)
150s 
150s             if isinstance(e.reason, ResponseError):
150s                 raise RetryError(e, request=request)
150s 
150s             if isinstance(e.reason, _ProxyError):
150s >               raise ProxyError(e, request=request)
150s E               requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/environments (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError
150s ------------------------------ Captured log call -------------------------------
150s WARNING  httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128)
150s WARNING  httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(228 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/environments via GET at 1750344359.6938248
150s ERROR    pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP.
150s _________________________ TestQueryAPI.test_inventory __________________________
150s 
150s self = 
150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/inventory'
150s body = None
150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'}
150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
150s redirect = False, assert_same_host = False
150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
150s release_conn = False, chunked = False, body_pos = None, preload_content = False
150s decode_content = False, response_kw = {}
150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/inventory', query=None, fragment=None)
150s destination_scheme = 'http', conn = None, release_this_conn = True
150s http_tunnel_required = False, err = None, clean_exit = False
150s 
150s     def urlopen( # type: ignore[override]
150s         self,
150s         method: str,
150s         url: str,
150s         body: _TYPE_BODY | None = None,
150s         headers: typing.Mapping[str, str] | None = None,
150s         retries: Retry | bool | int | None = None,
150s         redirect: bool = True,
150s         assert_same_host: bool = True,
150s         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
150s         pool_timeout: int | None = None,
150s         release_conn: bool | None = None,
150s         chunked: bool = False,
150s         body_pos: _TYPE_BODY_POSITION | None = None,
150s         preload_content: bool = True,
150s         decode_content: bool = True,
150s         **response_kw: typing.Any,
150s     ) -> BaseHTTPResponse:
150s         """
150s         Get a connection from the pool and perform an HTTP request. This is the
150s         lowest level call for making a request, so you'll need to specify all
150s         the raw details.
150s 
150s         .. note::
150s 
150s             More commonly, it's appropriate to use a convenience method
150s             such as :meth:`request`.
150s 
150s         .. note::
150s 
150s             `release_conn` will only behave as expected if
150s             `preload_content=False` because we want to make
150s             `preload_content=False` the default behaviour someday soon without
150s             breaking backwards compatibility.
150s 
150s         :param method:
150s             HTTP request method (such as GET, POST, PUT, etc.)
150s 
150s         :param url:
150s             The URL to perform the request on.
150s 
150s         :param body:
150s             Data to send in the request body, either :class:`str`, :class:`bytes`,
150s             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
150s 
150s         :param headers:
150s             Dictionary of custom headers to send, such as User-Agent,
150s             If-None-Match, etc. If None, pool headers are used. If provided,
150s             these headers completely replace any pool-specific headers.
150s 
150s         :param retries:
150s             Configure the number of retries to allow before raising a
150s             :class:`~urllib3.exceptions.MaxRetryError` exception.
150s 
150s             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
150s             :class:`~urllib3.util.retry.Retry` object for fine-grained control
150s             over different types of retries.
150s             Pass an integer number to retry connection errors that many times,
150s             but no other types of errors. Pass zero to never retry.
150s 
150s             If ``False``, then retries are disabled and any exception is raised
150s             immediately. Also, instead of raising a MaxRetryError on redirects,
150s             the redirect response will be returned.
150s 
150s         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
150s 
150s         :param redirect:
150s             If True, automatically handle redirects (status codes 301, 302,
150s             303, 307, 308). Each redirect counts as a retry. Disabling retries
150s             will disable redirect, too.
150s 
150s         :param assert_same_host:
150s             If ``True``, will make sure that the host of the pool requests is
150s             consistent else will raise HostChangedError. When ``False``, you can
150s             use the pool on an HTTP proxy and request foreign hosts.
150s 
150s         :param timeout:
150s             If specified, overrides the default timeout for this one
150s             request. It may be a float (in seconds) or an instance of
150s             :class:`urllib3.util.Timeout`.
150s 
150s         :param pool_timeout:
150s             If set and the pool is set to block=True, then this method will
150s             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
150s             connection is available within the time period.
150s 
150s         :param bool preload_content:
150s             If True, the response's body will be preloaded into memory.
150s 
150s         :param bool decode_content:
150s             If True, will attempt to decode the body based on the
150s             'content-encoding' header.
150s 
150s         :param release_conn:
150s             If False, then the urlopen call will not release the connection
150s             back into the pool once a response is received (but will release if
150s             you read the entire contents of the response such as when
150s             `preload_content=True`). This is useful if you're not preloading
150s             the response's content immediately. You will need to call
150s             ``r.release_conn()`` on the response ``r`` to return the connection
150s             back into the pool. If None, it takes the value of ``preload_content``
150s             which defaults to ``True``.
150s 
150s         :param bool chunked:
150s             If True, urllib3 will send the body using chunked transfer
150s             encoding. Otherwise, urllib3 will send the body using the standard
150s             content-length form. Defaults to False.
150s 
150s         :param int body_pos:
150s             Position to seek to in file-like body in the event of a retry or
150s             redirect. Typically this won't need to be set because urllib3 will
150s             auto-populate the value when needed.
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET', url = 'http://localhost:8080/pdb/query/v4/inventory' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s             if redirect is not None:
150s                 redirect -= 1
150s             cause = "too many redirects"
150s             response_redirect_location = response.get_redirect_location()
150s             if response_redirect_location:
150s                 redirect_location = response_redirect_location
150s             status = response.status
150s 
150s         else:
150s             # Incrementing because of a server error like a 500 in
150s             # status_forcelist and the given method is in the allowed_methods
150s             cause = ResponseError.GENERIC_ERROR
150s             if response and response.status:
150s                 if status_count is not None:
150s                     status_count -= 1
150s                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
150s                 status = response.status
150s 
150s         history = self.history + (
150s             RequestHistory(method, url, error, status, redirect_location),
150s         )
150s 
150s         new_retry = self.new(
150s             total=total,
150s             connect=connect,
150s             read=read,
150s             redirect=redirect,
150s             status=status_count,
150s             other=other,
150s             history=history,
150s         )
150s 
150s         if new_retry.is_exhausted():
150s             reason = error or ResponseError(cause)
150s >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
150s E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/inventory (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')))
150s 
150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError
150s 
150s During handling of the above exception, another exception occurred:
150s 
150s self = 
150s api = 
150s 
150s     def test_inventory(self, api):
150s         inventory_body = [
150s             {
150s                 "certname": "test_certname",
150s                 "timestamp": "2017-06-05T20:18:23.374Z",
150s                 "environment": "test_environment",
150s                 "facts": "test_facts",
150s                 "trusted": "test_trusted",
150s             }
150s         ]
150s         inventory_url = "http://localhost:8080/pdb/query/v4/inventory"
150s 
150s         httpretty.enable()
150s         httpretty.register_uri(
150s             httpretty.GET, inventory_url, body=json.dumps(inventory_body)
150s         )
150s >       for inv in api.inventory():
150s 
150s tests/test_api_query.py:87: 
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s /usr/lib/python3/dist-packages/pypuppetdb/api/query.py:360: in inventory
150s     inventory = self._query("inventory", **kwargs)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query
150s     return self._make_request(url, request_method, payload)
150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request
150s     r = self.session.get(
150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
150s     return self.request("GET", url, **kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
150s     resp = self.send(prep, **send_kwargs)
150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send
150s     r = adapter.send(request, **kwargs)
150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
150s 
150s self = 
150s request = , stream = False
150s timeout = Timeout(connect=10, read=10, total=None), verify = True
150s cert = (None, None)
150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'})
150s 
150s     def send(
150s         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
150s     ):
150s         """Sends PreparedRequest object. Returns Response object.
150s 
150s         :param request: The :class:`PreparedRequest ` being sent.
150s         :param stream: (optional) Whether to stream the request content.
150s         :param timeout: (optional) How long to wait for the server to send
150s             data before giving up, as a float, or a :ref:`(connect timeout,
150s             read timeout) ` tuple.
150s         :type timeout: float or tuple or urllib3 Timeout object
150s         :param verify: (optional) Either a boolean, in which case it controls whether
150s             we verify the server's TLS certificate, or a string, in which case it
150s             must be a path to a CA bundle to use
150s         :param cert: (optional) Any user-provided SSL certificate to be trusted.
150s         :param proxies: (optional) The proxies dictionary to apply to the request.
150s         :rtype: requests.Response
150s         """
150s 
150s         try:
150s             conn = self.get_connection_with_tls_context(
150s                 request, verify, proxies=proxies, cert=cert
150s             )
150s         except LocationValueError as e:
150s             raise InvalidURL(e, request=request)
150s 
150s         self.cert_verify(conn, request.url, verify, cert)
150s         url = self.request_url(request, proxies)
150s         self.add_headers(
150s             request,
150s             stream=stream,
150s             timeout=timeout,
150s             verify=verify,
150s             cert=cert,
150s             proxies=proxies,
150s         )
150s 
150s         chunked = not (request.body is None or "Content-Length" in request.headers)
150s 
150s         if isinstance(timeout, tuple):
150s             try:
150s                 connect, read = timeout
150s                 timeout = TimeoutSauce(connect=connect, read=read)
150s             except ValueError:
150s                 raise ValueError(
150s                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
150s                     f"or a single float to set both timeouts to the same value."
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/inventory (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(225 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/inventory via GET at 1750344359.8080735 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
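The ProxyError above shows the mocked request to localhost:8080 being routed to the testbed's egress proxy (egress.ps7.internal:3128), which httpretty cannot intercept: requests reads http_proxy/https_proxy from the environment and opens a real socket to the proxy before the registered URI is ever matched. A minimal stdlib sketch of that environment lookup, assuming a hypothetical proxy host (egress.example.internal is a placeholder, and clearing the variables is one common test-harness workaround, not necessarily the fix the pypuppetdb suite adopted):

```python
import os
import urllib.request

# Simulate the CI testbed, where a proxy variable is exported
# (hypothetical host; the real log shows egress.ps7.internal:3128).
os.environ["http_proxy"] = "http://egress.example.internal:3128"

# urllib -- and requests, which performs the same environment lookup --
# now selects the proxy even for a URL that httpretty has mocked.
print(urllib.request.getproxies_environment().get("http"))
# -> http://egress.example.internal:3128

# Clearing the variables lets test sockets reach the mock instead of the proxy.
for var in ("http_proxy", "https_proxy", "HTTP_PROXY", "HTTPS_PROXY"):
    os.environ.pop(var, None)

print(urllib.request.getproxies_environment().get("http"))  # -> None
```

Equivalently, setting `trust_env = False` on a `requests.Session` disables this environment lookup for that session only, without touching process-wide state.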
150s ________________________ TestQueryAPI.test_nodes_single ________________________ 150s 150s self = 150s method = 'GET' 150s url = 'http://localhost:8080/pdb/query/v4/nodes?query=%5B%22%3D%22%2C%22certname%22%2C%22greenserver.vm%22' 150s body = None 150s headers = {'content-type': 'application/json', 'accept': 'application/json', 'accept-charset': 'utf-8'} 150s retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s redirect = False, assert_same_host = False 150s timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 150s release_conn = False, chunked = False, body_pos = None, preload_content = False 150s decode_content = False, response_kw = {} 150s parsed_url = Url(scheme='http', auth=None, host='localhost', port=8080, path='/pdb/query/v4/nodes', query='query=%5B%22%3D%22%2C%22certname%22%2C%22greenserver.vm%22', fragment=None) 150s destination_scheme = 'http', conn = None, release_this_conn = True 150s http_tunnel_required = False, err = None, clean_exit = False 150s 150s def urlopen( # type: ignore[override] 150s self, 150s method: str, 150s url: str, 150s body: _TYPE_BODY | None = None, 150s headers: typing.Mapping[str, str] | None = None, 150s retries: Retry | bool | int | None = None, 150s redirect: bool = True, 150s assert_same_host: bool = True, 150s timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 150s pool_timeout: int | None = None, 150s release_conn: bool | None = None, 150s chunked: bool = False, 150s body_pos: _TYPE_BODY_POSITION | None = None, 150s preload_content: bool = True, 150s decode_content: bool = True, 150s **response_kw: typing.Any, 150s ) -> BaseHTTPResponse: 150s """ 150s Get a connection from the pool and perform an HTTP request. This is the 150s lowest level call for making a request, so you'll need to specify all 150s the raw details. 150s 150s .. note:: 150s 150s More commonly, it's appropriate to use a convenience method 150s such as :meth:`request`. 150s 150s .. 
note:: 150s 150s `release_conn` will only behave as expected if 150s `preload_content=False` because we want to make 150s `preload_content=False` the default behaviour someday soon without 150s breaking backwards compatibility. 150s 150s :param method: 150s HTTP request method (such as GET, POST, PUT, etc.) 150s 150s :param url: 150s The URL to perform the request on. 150s 150s :param body: 150s Data to send in the request body, either :class:`str`, :class:`bytes`, 150s an iterable of :class:`str`/:class:`bytes`, or a file-like object. 150s 150s :param headers: 150s Dictionary of custom headers to send, such as User-Agent, 150s If-None-Match, etc. If None, pool headers are used. If provided, 150s these headers completely replace any pool-specific headers. 150s 150s :param retries: 150s Configure the number of retries to allow before raising a 150s :class:`~urllib3.exceptions.MaxRetryError` exception. 150s 150s If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 150s :class:`~urllib3.util.retry.Retry` object for fine-grained control 150s over different types of retries. 150s Pass an integer number to retry connection errors that many times, 150s but no other types of errors. Pass zero to never retry. 150s 150s If ``False``, then retries are disabled and any exception is raised 150s immediately. Also, instead of raising a MaxRetryError on redirects, 150s the redirect response will be returned. 150s 150s :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 150s 150s :param redirect: 150s If True, automatically handle redirects (status codes 301, 302, 150s 303, 307, 308). Each redirect counts as a retry. Disabling retries 150s will disable redirect, too. 150s 150s :param assert_same_host: 150s If ``True``, will make sure that the host of the pool requests is 150s consistent else will raise HostChangedError. When ``False``, you can 150s use the pool on an HTTP proxy and request foreign hosts. 
150s 150s :param timeout: 150s If specified, overrides the default timeout for this one 150s request. It may be a float (in seconds) or an instance of 150s :class:`urllib3.util.Timeout`. 150s 150s :param pool_timeout: 150s If set and the pool is set to block=True, then this method will 150s block for ``pool_timeout`` seconds and raise EmptyPoolError if no 150s connection is available within the time period. 150s 150s :param bool preload_content: 150s If True, the response's body will be preloaded into memory. 150s 150s :param bool decode_content: 150s If True, will attempt to decode the body based on the 150s 'content-encoding' header. 150s 150s :param release_conn: 150s If False, then the urlopen call will not release the connection 150s back into the pool once a response is received (but will release if 150s you read the entire contents of the response such as when 150s `preload_content=True`). This is useful if you're not preloading 150s the response's content immediately. You will need to call 150s ``r.release_conn()`` on the response ``r`` to return the connection 150s back into the pool. If None, it takes the value of ``preload_content`` 150s which defaults to ``True``. 150s 150s :param bool chunked: 150s If True, urllib3 will send the body using chunked transfer 150s encoding. Otherwise, urllib3 will send the body using the standard 150s content-length form. Defaults to False. 150s 150s :param int body_pos: 150s Position to seek to in file-like body in the event of a retry or 150s redirect. Typically this won't need to be set because urllib3 will 150s auto-populate the value when needed. 
150s """ 150s parsed_url = parse_url(url) 150s destination_scheme = parsed_url.scheme 150s 150s if headers is None: 150s headers = self.headers 150s 150s if not isinstance(retries, Retry): 150s retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 150s 150s if release_conn is None: 150s release_conn = preload_content 150s 150s # Check host 150s if assert_same_host and not self.is_same_host(url): 150s raise HostChangedError(self, url, retries) 150s 150s # Ensure that the URL we're connecting to is properly encoded 150s if url.startswith("/"): 150s url = to_str(_encode_target(url)) 150s else: 150s url = to_str(parsed_url.url) 150s 150s conn = None 150s 150s # Track whether `conn` needs to be released before 150s # returning/raising/recursing. Update this variable if necessary, and 150s # leave `release_conn` constant throughout the function. That way, if 150s # the function recurses, the original value of `release_conn` will be 150s # passed down into the recursive call, and its value will be respected. 150s # 150s # See issue #651 [1] for details. 150s # 150s # [1] 150s release_this_conn = release_conn 150s 150s http_tunnel_required = connection_requires_http_tunnel( 150s self.proxy, self.proxy_config, destination_scheme 150s ) 150s 150s # Merge the proxy headers. Only done when not using HTTP CONNECT. We 150s # have to copy the headers dict so we can safely change it without those 150s # changes being reflected in anyone else's copy. 150s if not http_tunnel_required: 150s headers = headers.copy() # type: ignore[attr-defined] 150s headers.update(self.proxy_headers) # type: ignore[union-attr] 150s 150s # Must keep the exception bound to a separate variable or else Python 3 150s # complains about UnboundLocalError. 150s err = None 150s 150s # Keep track of whether we cleanly exited the except block. This 150s # ensures we do proper cleanup in finally. 150s clean_exit = False 150s 150s # Rewind body position, if needed. 
Record current position 150s # for future rewinds in the event of a redirect/retry. 150s body_pos = set_file_position(body, body_pos) 150s 150s try: 150s # Request a connection from the queue. 150s timeout_obj = self._get_timeout(timeout) 150s conn = self._get_conn(timeout=pool_timeout) 150s 150s conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 150s 150s # Is this a closed/new connection that requires CONNECT tunnelling? 150s if self.proxy is not None and http_tunnel_required and conn.is_closed: 150s try: 150s self._prepare_proxy(conn) 150s except (BaseSSLError, OSError, SocketTimeout) as e: 150s self._raise_timeout( 150s err=e, url=self.proxy.url, timeout_value=conn.timeout 150s ) 150s raise 150s 150s # If we're going to release the connection in ``finally:``, then 150s # the response doesn't need to know about the connection. Otherwise 150s # it will also try to release it and we'll have a double-release 150s # mess. 150s response_conn = conn if not release_conn else None 150s 150s # Make the request on the HTTPConnection object 150s > response = self._make_request( 150s conn, 150s method, 150s url, 150s timeout=timeout_obj, 150s body=body, 150s headers=headers, 150s chunked=chunked, 150s retries=retries, 150s response_conn=response_conn, 150s preload_content=preload_content, 150s decode_content=decode_content, 150s **response_kw, 150s ) 150s 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:534: in _make_request 150s response = conn.getresponse() 150s /usr/lib/python3/dist-packages/urllib3/connection.py:516: in getresponse 150s httplib_response = super().getresponse() 150s /usr/lib/python3.13/http/client.py:1430: in getresponse 150s response.begin() 150s /usr/lib/python3.13/http/client.py:331: in begin 150s version, status, reason = self._read_status() 150s _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s 150s def _read_status(self): 150s line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 150s if len(line) > _MAXLINE: 150s raise LineTooLong("status line") 150s if self.debuglevel > 0: 150s print("reply:", repr(line)) 150s if not line: 150s # Presumably, the server closed the connection before 150s # sending a valid response. 150s > raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s E http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s /usr/lib/python3.13/http/client.py:300: RemoteDisconnected 150s 150s The above exception was the direct cause of the following exception: 150s Traceback (most recent call last): 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 787, in urlopen 150s response = self._make_request( 150s conn, 150s ...<10 lines>... 150s **response_kw, 150s ) 150s File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 534, in _make_request 150s response = conn.getresponse() 150s File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 516, in getresponse 150s httplib_response = super().getresponse() 150s File "/usr/lib/python3.13/http/client.py", line 1430, in getresponse 150s response.begin() 150s ~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 331, in begin 150s version, status, reason = self._read_status() 150s ~~~~~~~~~~~~~~~~~^^ 150s File "/usr/lib/python3.13/http/client.py", line 300, in _read_status 150s raise RemoteDisconnected("Remote end closed connection without" 150s " response") 150s http.client.RemoteDisconnected: Remote end closed connection without response 150s 150s The above exception was the direct cause of the following exception: 150s 150s urllib3.exceptions.ProxyError: ('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s 150s The above exception was the direct 
cause of the following exception: 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. 
Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s > resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:667: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/urllib3/connectionpool.py:841: in urlopen 150s retries = retries.increment( 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 150s method = 'GET' 150s url = 'http://localhost:8080/pdb/query/v4/nodes?query=%5B%22%3D%22%2C%22certname%22%2C%22greenserver.vm%22' 150s response = None 150s error = ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response')) 150s _pool = 150s _stacktrace = 150s 150s def increment( 150s self, 150s method: str | None = None, 150s url: str | None = None, 150s response: BaseHTTPResponse | None = None, 150s error: Exception | None = None, 150s _pool: ConnectionPool | None = None, 150s _stacktrace: TracebackType | None = None, 150s ) -> Self: 150s """Return a new Retry object with incremented retry counters. 150s 150s :param response: A response object, or None, if the server did not 150s return a response. 150s :type response: :class:`~urllib3.response.BaseHTTPResponse` 150s :param Exception error: An error encountered during the request, or 150s None if the response was received successfully. 150s 150s :return: A new ``Retry`` object. 
150s """ 150s if self.total is False and error: 150s # Disabled, indicate to re-raise the error. 150s raise reraise(type(error), error, _stacktrace) 150s 150s total = self.total 150s if total is not None: 150s total -= 1 150s 150s connect = self.connect 150s read = self.read 150s redirect = self.redirect 150s status_count = self.status 150s other = self.other 150s cause = "unknown" 150s status = None 150s redirect_location = None 150s 150s if error and self._is_connection_error(error): 150s # Connect retry? 150s if connect is False: 150s raise reraise(type(error), error, _stacktrace) 150s elif connect is not None: 150s connect -= 1 150s 150s elif error and self._is_read_error(error): 150s # Read retry? 150s if read is False or method is None or not self._is_method_retryable(method): 150s raise reraise(type(error), error, _stacktrace) 150s elif read is not None: 150s read -= 1 150s 150s elif error: 150s # Other retry? 150s if other is not None: 150s other -= 1 150s 150s elif response and response.get_redirect_location(): 150s # Redirect retry? 
150s if redirect is not None: 150s redirect -= 1 150s cause = "too many redirects" 150s response_redirect_location = response.get_redirect_location() 150s if response_redirect_location: 150s redirect_location = response_redirect_location 150s status = response.status 150s 150s else: 150s # Incrementing because of a server error like a 500 in 150s # status_forcelist and the given method is in the allowed_methods 150s cause = ResponseError.GENERIC_ERROR 150s if response and response.status: 150s if status_count is not None: 150s status_count -= 1 150s cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 150s status = response.status 150s 150s history = self.history + ( 150s RequestHistory(method, url, error, status, redirect_location), 150s ) 150s 150s new_retry = self.new( 150s total=total, 150s connect=connect, 150s read=read, 150s redirect=redirect, 150s status=status_count, 150s other=other, 150s history=history, 150s ) 150s 150s if new_retry.is_exhausted(): 150s reason = error or ResponseError(cause) 150s > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 150s E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?query=%5B%22%3D%22%2C%22certname%22%2C%22greenserver.vm%22 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/urllib3/util/retry.py:519: MaxRetryError 150s 150s During handling of the above exception, another exception occurred: 150s 150s self = 150s api = 150s 150s def test_nodes_single(self, api): 150s body = { 150s "cached_catalog_status": "not_used", 150s "catalog_environment": "production", 150s "catalog_timestamp": "2016-08-15T11:06:26.275Z", 150s "certname": "greenserver.vm", 150s "deactivated": None, 150s "expired": None, 150s "facts_environment": "production", 150s 
"facts_timestamp": "2016-08-15T11:06:26.140Z", 150s "latest_report_hash": "4a956674b016d95a7b77c99513ba26e4a744f8d1", 150s "latest_report_noop": False, 150s "latest_report_noop_pending": None, 150s "latest_report_status": "changed", 150s "report_environment": "production", 150s "report_timestamp": "2016-08-15T11:06:18.393Z", 150s } 150s url = "http://localhost:8080/pdb/query/v4/nodes" 150s 150s httpretty.enable() 150s httpretty.register_uri(httpretty.GET, url, body=json.dumps(body)) 150s 150s > nodes = list(api.nodes(query='["=","certname","greenserver.vm"')) 150s 150s tests/test_api_query.py:117: 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s /usr/lib/python3/dist-packages/pypuppetdb/api/query.py:51: in nodes 150s nodes = self._query("nodes", **kwargs) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:389: in _query 150s return self._make_request(url, request_method, payload) 150s /usr/lib/python3/dist-packages/pypuppetdb/api/base.py:410: in _make_request 150s r = self.session.get( 150s /usr/lib/python3/dist-packages/requests/sessions.py:602: in get 150s return self.request("GET", url, **kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:589: in request 150s resp = self.send(prep, **send_kwargs) 150s /usr/lib/python3/dist-packages/requests/sessions.py:703: in send 150s r = adapter.send(request, **kwargs) 150s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 150s 150s self = 150s request = , stream = False 150s timeout = Timeout(connect=10, read=10, total=None), verify = True 150s cert = (None, None) 150s proxies = OrderedDict({'https': 'http://egress.ps7.internal:3128/', 'http': 'http://egress.ps7.internal:3128/'}) 150s 150s def send( 150s self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 150s ): 150s """Sends PreparedRequest object. Returns Response object. 150s 150s :param request: The :class:`PreparedRequest ` being sent. 
150s :param stream: (optional) Whether to stream the request content. 150s :param timeout: (optional) How long to wait for the server to send 150s data before giving up, as a float, or a :ref:`(connect timeout, 150s read timeout) ` tuple. 150s :type timeout: float or tuple or urllib3 Timeout object 150s :param verify: (optional) Either a boolean, in which case it controls whether 150s we verify the server's TLS certificate, or a string, in which case it 150s must be a path to a CA bundle to use 150s :param cert: (optional) Any user-provided SSL certificate to be trusted. 150s :param proxies: (optional) The proxies dictionary to apply to the request. 150s :rtype: requests.Response 150s """ 150s 150s try: 150s conn = self.get_connection_with_tls_context( 150s request, verify, proxies=proxies, cert=cert 150s ) 150s except LocationValueError as e: 150s raise InvalidURL(e, request=request) 150s 150s self.cert_verify(conn, request.url, verify, cert) 150s url = self.request_url(request, proxies) 150s self.add_headers( 150s request, 150s stream=stream, 150s timeout=timeout, 150s verify=verify, 150s cert=cert, 150s proxies=proxies, 150s ) 150s 150s chunked = not (request.body is None or "Content-Length" in request.headers) 150s 150s if isinstance(timeout, tuple): 150s try: 150s connect, read = timeout 150s timeout = TimeoutSauce(connect=connect, read=read) 150s except ValueError: 150s raise ValueError( 150s f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 150s f"or a single float to set both timeouts to the same value." 
150s ) 150s elif isinstance(timeout, TimeoutSauce): 150s pass 150s else: 150s timeout = TimeoutSauce(connect=timeout, read=timeout) 150s 150s try: 150s resp = conn.urlopen( 150s method=request.method, 150s url=url, 150s body=request.body, 150s headers=request.headers, 150s redirect=False, 150s assert_same_host=False, 150s preload_content=False, 150s decode_content=False, 150s retries=self.max_retries, 150s timeout=timeout, 150s chunked=chunked, 150s ) 150s 150s except (ProtocolError, OSError) as err: 150s raise ConnectionError(err, request=request) 150s 150s except MaxRetryError as e: 150s if isinstance(e.reason, ConnectTimeoutError): 150s # TODO: Remove this in 3.0.0: see #2811 150s if not isinstance(e.reason, NewConnectionError): 150s raise ConnectTimeout(e, request=request) 150s 150s if isinstance(e.reason, ResponseError): 150s raise RetryError(e, request=request) 150s 150s if isinstance(e.reason, _ProxyError): 150s > raise ProxyError(e, request=request) 150s E requests.exceptions.ProxyError: HTTPConnectionPool(host='egress.ps7.internal', port=3128): Max retries exceeded with url: http://localhost:8080/pdb/query/v4/nodes?query=%5B%22%3D%22%2C%22certname%22%2C%22greenserver.vm%22 (Caused by ProxyError('Unable to connect to proxy', RemoteDisconnected('Remote end closed connection without response'))) 150s 150s /usr/lib/python3/dist-packages/requests/adapters.py:694: ProxyError 150s ------------------------------ Captured log call ------------------------------- 150s WARNING httpretty.core:core.py:638 real call to socket.connect() for ('egress.ps7.internal', 3128) 150s WARNING httpretty.core:core.py:725 httpretty.core.socket("egress.ps7.internal:3128").real_sendall(280 bytes) to ://localhost:8080http://localhost:8080/pdb/query/v4/nodes?query=%5B%22%3D%22%2C%22certname%22%2C%22greenserver.vm%22 via GET at 1750344359.9218235 150s ERROR pypuppetdb.api.base:base.py:454 Could not reach PuppetDB on localhost:8080 over HTTP. 
150s =========================== short test summary info ============================
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_setting_headers_without_token
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_setting_headers_with_token
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_path - reque...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_url_path - r...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_password_authorization
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_token_authorization
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_query - requ...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_order - requ...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_limit - requ...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_include_total
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_offset - req...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_summarize_by
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_count_by - r...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_count_filter
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_payload_get
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_with_payload_post
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_response_empty - ...
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_response_x_records
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_query_with_post[string]
150s FAILED tests/test_api_base_query.py::TestBaseAPIQuery::test_query_with_post[QueryBuilder]
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v1 - requests.e...
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v1_list - reque...
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v1_version_constructor
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v2 - requests.e...
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v2_version_constructor
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v2_version_string
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v2_error - requ...
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v2_escape_special_characters
150s FAILED tests/test_api_metrics.py::TestMetricsAPI::test_metric_v2_list - reque...
150s FAILED tests/test_api_other.py::TestCommandAPI::test_command - requests.excep...
150s FAILED tests/test_api_other.py::TestCommandAPI::test_cmd[string] - requests.e...
150s FAILED tests/test_api_other.py::TestCommandAPI::test_cmd[QueryBuilder] - requ...
150s FAILED tests/test_api_other.py::TestCommandAPI::test_cmd_with_token_authorization
150s FAILED tests/test_api_other.py::TestStatusAPI::test_status - requests.excepti...
150s FAILED tests/test_api_pql.py::TestPqlAPI::test_pql_casting - requests.excepti...
150s FAILED tests/test_api_pql.py::TestPqlAPI::test_pql_no_casting - requests.exce...
150s FAILED tests/test_api_query.py::TestQueryAPI::test_facts - requests.exception...
150s FAILED tests/test_api_query.py::TestQueryAPI::test_fact_names - requests.exce...
150s FAILED tests/test_api_query.py::TestQueryAPI::test_environments - requests.ex...
150s FAILED tests/test_api_query.py::TestQueryAPI::test_inventory - requests.excep...
150s FAILED tests/test_api_query.py::TestQueryAPI::test_nodes_single - requests.ex...
150s ======================== 41 failed, 122 passed in 5.55s ========================
151s autopkgtest [14:46:02]: test unittests: -----------------------]
151s unittests FAIL non-zero exit status 1
151s autopkgtest [14:46:02]: test unittests: - - - - - - - - - - results - - - - - - - - - -
152s autopkgtest [14:46:03]: @@@@@@@@@@@@@@@@@@@@ summary
152s unittests FAIL non-zero exit status 1
156s nova [W] Using flock in prodstack7-ppc64el
156s Creating nova instance adt-questing-ppc64el-pypuppetdb-20250619-144330-juju-7f2275-prod-proposed-migration-environment-21-0e8d970d-a213-4f22-84ec-9e8243d4143e from image adt/ubuntu-questing-ppc64el-server-20250619.img (UUID 1c97422d-c646-492e-9581-3c98f213de4b)...
156s nova [W] Timed out waiting for 65219fc7-943b-4cd3-b09e-c2feedd6971e to get deleted.
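All 41 failures share one root cause: the testbed exports proxy environment variables, so every mocked localhost request is sent to the egress proxy instead of httpretty's fake socket. Besides unsetting those variables, a no_proxy exemption for localhost would also keep these requests off the proxy, since both urllib and requests consult it before selecting a proxy. A stdlib sketch of that check (hypothetical proxy host; whether the autopkgtest environment can or should set no_proxy here is an assumption, not something this log confirms):

```python
import os
import urllib.request

# Reproduce a proxied environment plus a localhost exemption
# (hypothetical proxy host, standing in for the testbed's egress proxy).
os.environ["http_proxy"] = "http://egress.example.internal:3128"
os.environ["no_proxy"] = "localhost,127.0.0.1"

# proxy_bypass_environment() is the no_proxy check urllib applies
# before choosing a proxy for a given host.
print(urllib.request.proxy_bypass_environment("localhost"))    # True: bypassed
print(urllib.request.proxy_bypass_environment("example.com"))  # False: proxied
```

With such an exemption in place, the mocked http://localhost:8080 requests would never reach egress.ps7.internal:3128, and httpretty could intercept them as the tests expect.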